
Data-driven Requirements Engineering in Agile Projects

The Q-Rapids approach to data-driven RE was presented at the JiTRE 2017 workshop.

Data-driven Requirements Engineering in Agile Projects

  1. Data-driven Requirements Engineering in Agile Projects: The Q-Rapids Approach
     Xavier Franch (UPC, Spain), Claudia Ayala (UPC, Spain), Lidia López (UPC, Spain), Silverio Martínez-Fernández (Fraunhofer IESE, Germany), Pilar Rodríguez (U. Oulu, Finland), Cristina Gómez (UPC, Spain), Andreas Jedlitschka (Fraunhofer IESE, Germany), Markku Oivo (U. Oulu, Finland), Jari Partanen (Bittium, Finland), Timo Räty (Bittium, Finland), Veikko Rytivaara (Bittium, Finland)
     Funded by the European Union
  2. Motivation
     • Use of available data as a way to implement just-in-time requirements engineering
     • Q-Rapids project: an EU H2020 Research and Innovation Action built on this vision
     • "Q": emphasis on quality requirements
  3. The Q-Rapids approach
  4. Example: blocking
     • Identification of blocking situations while developing a feature or user story
     • Related to waiting time
       o Against the lean principle of "deliver fast"
     • Some reasons reported in the literature
       o cf. McConnell and Goldratt; Sedano and Ralph (2017)
     • The identification of blocking situations can be used to evaluate software quality and identify quality requirements
       o E.g., about testability
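As an illustration of how blocking situations could be detected from project data, here is a minimal sketch in Python. It assumes work items exported from an issue tracker with hypothetical fields (key, status, blocked_since) and an illustrative 48-hour waiting-time threshold; neither the field names nor the threshold come from the slides or from the Q-Rapids tooling.

```python
from datetime import datetime, timezone

# Hypothetical shape of work items exported from an issue tracker;
# the field names are assumptions, not the actual Q-Rapids connectors.
work_items = [
    {"key": "FEAT-101", "status": "Blocked",
     "blocked_since": datetime(2017, 7, 3, tzinfo=timezone.utc)},
    {"key": "FEAT-102", "status": "In Progress", "blocked_since": None},
]

WAITING_TIME_LIMIT_H = 48  # illustrative threshold, not a Q-Rapids value

def blocking_situations(items, now=None):
    """Return (key, waiting hours) for items blocked longer than the threshold."""
    now = now or datetime.now(timezone.utc)
    flagged = []
    for item in items:
        if item["status"] == "Blocked" and item["blocked_since"]:
            waiting_h = (now - item["blocked_since"]).total_seconds() / 3600
            if waiting_h > WAITING_TIME_LIMIT_H:
                flagged.append((item["key"], round(waiting_h, 1)))
    return flagged

print(blocking_situations(work_items))
```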
  5. Factors affecting blocking in Q-Rapids
     • Feature definition completion
       o Metrics: number of features incomplete in backlog; average time to complete feature definition
       o Data source: issue tracking system
     • Tests failing
       o Metrics: number of tests failed; test coverage; number of omitted/not-run tests
       o Data source: continuous integration tool
     • Delayed tasks
       o Metrics: number of blocked tasks; number of blocking tasks; waiting time to finish blocking tasks
       o Data source: continuous integration tool
     • Tests performance
       o Metrics: time to execute tests; prediction estimate of development until next release based on test status
       o Data source: continuous integration tool
     • Low quality features
       o Metrics: time to solve quality rule violations on the feature
       o Data source: static code analysis tool
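To make the factor-metric mapping in slide 5 more concrete, the following sketch shows one way the raw metrics of the "Delayed tasks" factor could be normalized into a single value in [0, 1]. The thresholds and the min-based combination are assumptions chosen for illustration; the slides do not specify how Q-Rapids actually computes factor values.

```python
def metric_utility(value, worst, best):
    """Map a raw metric linearly onto [0, 1] between a worst and a best value."""
    if worst == best:
        return 1.0
    u = (value - worst) / (best - worst)
    return max(0.0, min(1.0, u))

def delayed_tasks_factor(n_blocked, n_blocking, avg_wait_h):
    """Combine the 'Delayed tasks' metrics; the weakest metric dominates."""
    utilities = [
        metric_utility(n_blocked, worst=20, best=0),   # number of blocked tasks
        metric_utility(n_blocking, worst=10, best=0),  # number of blocking tasks
        metric_utility(avg_wait_h, worst=72, best=0),  # waiting time (hours)
    ]
    return min(utilities)

# Example: 4 blocked tasks, 1 blocking task, 30 h average waiting time
print(delayed_tasks_factor(n_blocked=4, n_blocking=1, avg_wait_h=30))
```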
  6. The Strategic Dashboard: concept
     • Aggregation of factors into a single strategic indicator
       o Use of Bayesian networks combining real data and experts' opinion
     • Drill-down capabilities
     • Evaluation of decisions (impact, value, effort, risk, ...)
     • Prediction rules to detect upcoming violations
     • What-if analysis
     • Mitigation strategies
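Slide 6 mentions aggregating factors into a strategic indicator with Bayesian networks that combine monitored data and expert opinion. The sketch below is a minimal, hand-rolled version of that idea: factor beliefs estimated from data are marginalized against an expert-style conditional probability table. All node names and probabilities are invented for illustration, not Q-Rapids values.

```python
from itertools import product

# P(factor is "good"), e.g. estimated from the monitored data
factor_belief = {"delayed_tasks": 0.70, "tests_failing": 0.85}

# Expert-elicited style CPT: P(indicator = "good" | delayed_tasks, tests_failing)
cpt_indicator_good = {
    (True, True): 0.95,
    (True, False): 0.60,
    (False, True): 0.50,
    (False, False): 0.10,
}

def indicator_probability(belief, cpt):
    """Marginalize the CPT over factor states (factors assumed independent)."""
    p_good = 0.0
    for states in product([True, False], repeat=len(belief)):
        weight = 1.0
        for (_, p), state in zip(belief.items(), states):
            weight *= p if state else (1 - p)
        p_good += weight * cpt[states]
    return p_good

print(f"P(blocking indicator is good) = "
      f"{indicator_probability(factor_belief, cpt_indicator_good):.3f}")
```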
  7. The Strategic Dashboard: some views
  8. The Strategic Dashboard: some views
  9. The Process
     • Implementing a continuous end-to-end flow of features in organizations
       o cf. the authors' QUASD@PROFES'17 paper on applying method engineering
     • E.g., optimal management of features by real-time identification and understanding of "blocking" features
       o Information provided by the dashboard at different organizational levels (business owners, product owners, developers, testers, ...)
       o Catalogue of possible actions: include/drop items in/from backlogs; re-prioritization; stop the line (until the blocking situation is solved)
       o Fit to the agile method of the organization, e.g.:
         • Kanban: input to the Kanban board
         • Scrum: use of the dashboard in prioritization
       o In any case, a gain in transparency
  10. Research agenda
      • Development of models for strategic indicators
        o E.g., time-to-market, business value, ...
      • Connection to requirements management (especially quality requirements)
      • Infrastructure for continuous quality monitoring
      • ...
  11. Closing slide: repeats the title, author list, and funding acknowledgement from slide 1.
