
A quality model for actionable analytics in rapid software development


Published at SEAA 2018


  1. A Quality Model for Actionable Analytics in Rapid Software Development. Silverio Martínez-Fernández, Andreas Jedlitschka, Liliana Guzmán, Anna Maria Vollmer (Anna-Maria.Vollmer@iese.fraunhofer.de). This project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 732253.
  2. Q-Rapids vision: supporting rapid software development. Quality requirements are proposed through data mining and intelligent analysis of data.
  3. Why Quality? Quality: “the totality of characteristics of an entity that bear on its ability to satisfy stated and implied needs” (ISO 8402: Quality management and quality assurance – Vocabulary). Quality makes a difference: efficiency of IT and process development is an enabler for reacting quickly to market trends, and quality is the critical success factor for the resulting products; once spoiled, quality is hard and expensive to fix. Quality model: “definition and operationalization of product or process quality”; the typical approach is the refinement of “quality” down to metrics. Quality depends on the application domain, the stakeholders, and the usage purpose. (A Quality Model for Actionable Analytics in Rapid Software Development, Vollmer, Euromicro SEAA, 30th August 2018)
  4. Activities: Overview. Two activities: data gathering and data analysis. On-site software quality workshops and interviews by two researchers produced a study of quality problems, user stories and quality models per use case, and a landscape of data sources; these were consolidated into a single quality model, validated by the industry partners, and an architecture specification ([Cluster] with connectors connect-jira, connect-jenkins, connect-svn, connect-sonarq, connect-redmine, and connect-elastic; qr-connect feeds qr-eval and the Dashboard).
  5. Consolidated Q-Rapids Quality Model. What is quality? How can we measure it? A quality model is a specification of the quality construct that directly integrates the definition of quality requirements with measurement: from abstract strategic indicators down to quantitatively and objectively assessed metrics. In the proof of concept, the quality model is composed of five levels: Strategic Indicator → Product Factor → Assessed Metric → Raw Data → Data Source.
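The five-level chain of the quality model can be sketched as a small data structure. This is a minimal illustration only; the class and field names are assumptions, not the project’s actual implementation:

```python
from dataclasses import dataclass, field

@dataclass
class AssessedMetric:
    name: str
    raw_data: list[str]   # raw measures the metric is computed from
    data_source: str      # e.g. "SonarQube"

@dataclass
class ProductFactor:
    name: str
    metrics: list[AssessedMetric] = field(default_factory=list)

@dataclass
class StrategicIndicator:
    name: str
    factors: list[ProductFactor] = field(default_factory=list)

# One branch of the consolidated model (slide 10):
model = StrategicIndicator("Product Quality", [
    ProductFactor("Code Quality", [
        AssessedMetric("Non-complex files",
                       ["File cyclomatic complexity", "No. functions"],
                       "SonarQube"),
    ]),
])
```

Each level only references the level below it, mirroring the refinement from strategic indicators down to data sources.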
  6. Strategic Indicators. A strategic indicator is an aspect that a company considers relevant for its decision-making process. The proof of concept has two strategic indicators. Product Quality: how well a product meets code quality, testing status, and software stability. Blocking: signals that a condition has been detected that negatively influences the progress of the regular workflow. Strategic indicators are composed of product factors.
  7. Product Factors. Product factors are attributes of parts of the product; they need to be concrete enough to be measured. The proof of concept has several product factors. Examples related to product quality: Code Quality measures the impact of code changes on source code quality; Testing Status measures the quality and stability level of the tests executed during development; Software Stability measures the most critical issues. Product factors are composed of assessed metrics.
  8. Assessed Metrics. An assessed metric is a concrete description of how a specific product factor should be quantified for a specific context. It captures the experts’ preferences regarding metric values; preferences are expressed as “utility functions”, which map assessments into a [0..1] range. The proof of concept has several metrics. Examples related to code quality: Non-complex files, an assessment of files based on cyclomatic complexity; Commented files, an assessment of files based on the density of comments; Absence of duplications, an assessment of files based on duplicated lines. Assessed metrics are calculated from data coming from the data sources, e.g. Non-complex files = (number of non-complex files) / (total number of files).
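A utility function of the kind described above can be sketched as a piecewise-linear mapping into [0..1]. The breakpoints and file counts below are illustrative assumptions, not the calibration used in Q-Rapids:

```python
def utility(value, points):
    """Piecewise-linear utility: maps a raw assessment into [0, 1].

    `points` is a sorted list of (raw_value, utility) breakpoints;
    values outside the covered range are clamped to the end utilities.
    """
    if value <= points[0][0]:
        return points[0][1]
    for (x0, u0), (x1, u1) in zip(points, points[1:]):
        if value <= x1:
            # linear interpolation between the two breakpoints
            return u0 + (u1 - u0) * (value - x0) / (x1 - x0)
    return points[-1][1]

# Assessed metric: share of non-complex files (hypothetical counts).
non_complex, total = 80, 100
metric_value = non_complex / total          # 0.8

# Hypothetical expert preference: ratios at or below 0.5 are
# worthless, 1.0 is ideal.
u = utility(metric_value, [(0.5, 0.0), (1.0, 1.0)])  # 0.6
```

The shape of the breakpoint list is where the experts’ preferences live; the “veto” regions shown on slide 11 would correspond to breakpoints with utility 0.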
  9. Raw Data from Data Sources. Raw data are the data exactly as they come from the different data sources, without any modification; typically they cannot be broken down into simpler or more granular forms. Examples of raw data and their corresponding data sources:
     - Source code analysis measures (SonarQube, CodeSonar, Coverity), e.g. cyclomatic complexity
     - Test-related measures (SonarQube, Jenkins, JaCoCo plugin), e.g. test duration
     - Issue-related data (JIRA, GitLab, Redmine, Mantis), e.g. priority of an issue
     - Sprint-related data (JIRA, Redmine), e.g. duration of a sprint
     - Commit-related data (git, SVN), e.g. number of commits in a period
     - Code-review-related data (Gerrit), e.g. number of reviewers
  10. Consolidated Q-Rapids Quality Model. Creation (and evolution) of a quality model for actionable analytics: business-goal-oriented (selection of relevant, business-oriented metrics), aggregation of heterogeneous data sources into product factors (transparency), and automatic interpretation (i.e., assessment) of raw data. The consolidated model:
     - Product Quality → Code Quality: Commented files (comment lines, lines of code, …; SonarQube); Non-complex files (file cyclomatic complexity, no. of functions, …; SonarQube); Absence of duplications (duplicated lines, lines of code, …; SonarQube) | Testing Status: Passed tests (unit test errors, unit tests passed, …; Jenkins, GitLab) | Software Stability: Ratio of open/in-progress bugs (no. of open bugs, no. of open issues, …; JIRA, Redmine, GitLab)
     - Blocking → Blocking code: Fulfillment of critical/blocker quality rules (critical issues, blocking issues, …; SonarQube) | Issues’ specification: Issues completely specified (filled description in an issue, filled due date, …; JIRA, Redmine, GitLab) | Test performance: Fast test builds (test duration, …; Jenkins)
  11. Quality Model Assessment: How Does It Work? A bottom-up approach. Raw data from the data source (static software code analysis from SonarQube: M1, density of comments of a file; M2, total number of files; M3, cyclomatic complexity of a file; M4, number of functions of a file) are mapped through utility functions into assessed metrics, e.g. AM1, Commented files, with U(F1) = 0.5, and AM2, Non-complex files, with U(F2) = 0.4 (with a veto region for very high cyclomatic complexity). Assessed metrics are aggregated into the product factors Code Quality, Testing Status, and Software Stability (here with equal weights w = 0.33), and the product factors into the strategic indicator Product Quality (value = 0.26). At each level the aggregation is the weighted sum Assessment = Σ_{i=1..n} w_i · U(F_i).
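The bottom-up aggregation can be sketched as the same weighted sum applied at every level. U(F1) = 0.5, U(F2) = 0.4, and the equal weights of 0.33 come from the slide; the utility of the third metric is an assumption for illustration:

```python
def aggregate(weighted_utilities):
    """One aggregation step of the bottom-up assessment:
    sum of w_i * U_i over the elements of the level below."""
    return sum(w * u for w, u in weighted_utilities)

# Code Quality from its three assessed metrics:
code_quality = aggregate([
    (0.33, 0.5),   # AM1: Commented files, U(F1) = 0.5
    (0.33, 0.4),   # AM2: Non-complex files, U(F2) = 0.4
    (0.33, 0.0),   # Absence of duplications (assumed utility)
])
# code_quality ~= 0.297; the same function would then combine
# Code Quality, Testing Status, and Software Stability into the
# Product Quality strategic indicator.
```

With all weights equal, the weighted sum reduces to (roughly) the average of the utilities, which matches the equal-weight configuration shown on the slide.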
  12. Implementation of the Q-Rapids Quality Model: the Q-Rapids tool as data collection/analysis engine.
  13. Q-Rapids Tool Modules. qr-connect module → data gathering. qr-eval module → implementation of the quality model for data analysis. Objects in Kibana → raw data visualization for actionable analytics. Together they turn the collected data into insights and actions.
  14. Modules: qr-connect for Data Gathering. Support for different data sources: nine connectors. Heterogeneous data providing valuable information about the process, the system, and its usage. Scalability: able to ingest a huge amount of data per second (e.g., usage data); more than 19 million data points from the use cases. An initial infrastructure for big data, attractive for companies to adopt. Customizations: a tutorial for modular “connectors”.
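A connector’s job can be sketched minimally as polling a source’s REST API and storing the raw data unmodified. This is a stdlib-only sketch; the real qr-connect modules write into an Elasticsearch cluster, and the URL and payload shape here are hypothetical:

```python
import json
import urllib.request

def to_ndjson(records):
    """Serialize raw data points, unmodified, one JSON document per
    line (the newline-delimited shape search indexes bulk-load)."""
    return "".join(json.dumps(r, sort_keys=True) + "\n" for r in records)

def collect(source_url, index_path):
    """Minimal connector sketch: fetch a JSON list of raw data points
    from a data source's REST API and append them to a local
    newline-delimited JSON file. Returns the number of points stored."""
    with urllib.request.urlopen(source_url) as resp:
        records = json.load(resp)
    with open(index_path, "a", encoding="utf-8") as index:
        index.write(to_ndjson(records))
    return len(records)
```

Keeping the data unmodified at this stage matches the model’s definition of raw data: interpretation happens later, in qr-eval, not in the connectors.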
  15. Modules: qr-eval for Data Analysis. Provides the information shown in the dashboard. The assessment uses the preferences and judgments of experts and/or learned data. Evaluation results from the use cases: the metrics and factors are understandable and relevant for identifying deficiencies.
  16. Modules: Raw Data Visualizations. Objects in Kibana are portable while maintaining the indexes defined by qr-connect. They support decision making, e.g., prioritizing critical open bugs and solving quality rule violations.
  17. Proof-of-Concept Evaluation (January 2018).
  18. PoC Evaluation: Design Overview. Four evaluations (one evaluation of the PoC per industry partner). Sample for the quality model and tool: 8 participants, incl. product owners, project managers, and developers. Objects of study and evaluation goals: the perception of the Q-Rapids quality model (product quality, blocking situation) regarding usefulness, relevance, right level of detail, and understandability; and of the Q-Rapids tool (identifying strengths and drawbacks in product quality, identifying blocking situations) regarding relevance and reliability (and further usability aspects).
  19. PoC Evaluation: Design Overview. Design: a controlled environment with four phases. Q-Rapids introduction (informed consent, presentation, demographics); training; tasks (finding product strengths/drawbacks, blocking situations, and process model deficiencies, with observation); and feedback & closing (questionnaire, open feedback).
  20. PoC Evaluation: Main Results. Participants’ perception (N: number of participants who rated the question; five-point response scale from 1, strongly disagree, to 5, strongly agree, plus the option “I don’t know”):
     - Quality model, usefulness: N = 7, Mdn 4 (4–4)
     - Quality model, understandability of metrics: N = 7, Mdn 4 (2–5)
     - Quality model, understandability of factors: N = 7, Mdn 3 (2–5)
     - Quality model, relevance: N = 7, Mdn 4 (3–5)
     - Tool, relevance: N = 8, Mdn 4 (3–4)
     - Tool, reliability: N = 8, Mdn 4 (3–4)
     The quality model is considered useful and relevant; the results indicate that the model is understandable, though this differs among its elements. The tool is considered relevant and reliable.
  21. PoC Evaluation: Main Results. Examples of needs for improvement (N: number of participants mentioning them):
     - Add definitions of metrics, quality factors, and strategic indicators (not self-explanatory): N = 7
     - Include support for determining threshold values: N = 4
     - Add links to the raw data to support the decision-making process: N = 4
     - Add information for properly interpreting visual cues, e.g., exact values (i.e., values before normalization): N = 4
     - Add a description of the quality model incl. the dependencies among its elements (e.g., as a tree map): N = 2
     The identified suggestions for improvement were discussed and prioritized with the scientific and industry partners; the evaluation results are contributing to improving the Q-Rapids components.
  22. PoC Evaluation: Main Limitations. The results can be interpreted only as indications/expectations, not final conclusions: the sample was diverse but small; the scope of the tasks assigned to the participants was limited and the time to solve them short (the Q-Rapids tool was used for analysis, not decision making); and there were several confounding factors, e.g., difficulties in understanding the questionnaires and non-announced attendees (outside the target population).
  23. Next Steps. Data gathering: ingesting more data, both at runtime and during development, e.g., logs (usage, errors) and iterations (velocity). Data analysis: generation of alerts based on the quality model (qr-alert module) in order to generate quality requirements. Transversal (both): improvements to the modules based on the evaluations, exploiting the results at Fraunhofer IESE, and performing further evaluations.
  24. Questions?
