With changes in software development methodologies, the role of the data modeler has changed significantly. In many organizations, data modelers now find themselves on the outside looking in, relegated to documentation "after the fact" rather than the active participation where the true value is added. To participate fully, modelers must not only adapt to an Agile work style, but must also be able to communicate the business value of model-driven development.
This session is based on a real case study in which data modeling was introduced part-way through a significant software development project that was quickly losing momentum due to high defect levels. Ron Huizenga will show the contrast in metrics and cost when utilizing skilled data modelers versus a development-only approach, with topics including:
Modeler participation in multiple Agile teams
Defect categories and impact
Measurement and analysis techniques
Remediation strategy
Breakthrough quality improvements
This "must see" session is not only for data modelers and architects, but also for the decision makers behind these initiatives, with information that is vital to modelers, IT executives, and business sponsors. So bring your boss to the session!
Talk a bit about my manufacturing background, the quality movement, and Six Sigma. Business transformation initiatives.
When discussing benefits, talk about the productivity improvements and consistency that are driven by a data modeling tool like ER/Studio.
Extreme Programming can sometimes be characterized as "anti-establishment"
Use the aircrew example for self-organizing teams. Pose the question: how would it be if flight attendants and pilots decided to swap jobs randomly between or during flights? What about doctors, nurses, and technicians in an operating room? Does anyone on the webinar want to volunteer to be a passenger in scenario 1, or a patient in scenario 2?
Projected extension based on calculated burndown
Not trying to tell all of you to become Black Belts. However, an objective framework that measures results is required.
Explain Six Sigma DMAIC (Define, Measure, Analyze, Improve, Control)
Based upon the above, it appears that the Database & Persistence defects are the most important category to address, but only by a slight margin. However, the relative impact of fixing this type of error should also be considered.
Wait a minute – not all defects have the same impact. How do we quantify that?
This shows the weighting assigned to each category. We have extended our original chart to show this.
When the weighting is applied, it is easy to see that the relative impact of the Database and Persistence defects is larger than the other 3 categories combined. Therefore, the focus of this project will be to address and minimize the database and persistence errors to as low a level as possible.
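The weighting arithmetic can be sketched as follows. Note that the category names other than Database & Persistence, the defect counts, and the weight values here are all hypothetical placeholders for illustration, not the project's actual figures:

```python
# Hypothetical sketch of weighted defect impact.
# Counts, weights, and category names (other than Database & Persistence)
# are invented for illustration only.
defects = {
    "Database & Persistence": 400,
    "UI": 300,
    "Business Logic": 250,
    "Integration": 127,
}
weights = {
    "Database & Persistence": 5,  # high rework cost per defect (assumed)
    "UI": 1,
    "Business Logic": 2,
    "Integration": 2,
}

# Weighted impact per category = raw count * weight
impact = {cat: n * weights[cat] for cat, n in defects.items()}

db_impact = impact["Database & Persistence"]
others = sum(v for cat, v in impact.items() if cat != "Database & Persistence")
print(impact)
print(db_impact > others)  # weighted DB impact exceeds the other categories combined
```

With the assumed weights, a category that is only narrowly ahead on raw counts ends up dominating once impact per defect is factored in, which mirrors the conclusion above.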
May wish to remove slide to reduce complexity
It is also important to understand the distribution of the defects across the 20-week measurement period. Each defect is based upon a table object or column object; therefore, the object counts measure each table and column created in the given week. As can be seen from the time series graph, the number of objects created in a given week is quite variable. This is because database and persistence work is only one area of development, as highlighted earlier. Once the persistence is mapped to the database objects, other types of programming and development occur. Thus, the number of defects was variable as well, but appeared to track at the same level as, or slightly higher than, the number of objects.
To examine this more closely, the ratio of defects to objects was calculated for each week, as well as the point value of the defects, since this is the true measure of the effort required to correct them. If we look at those results below, one could assume that every single object created had one or more defects. That is not necessarily the case; it does, however, indicate a major quality problem in general.
Based upon the results so far, it is evident that the development effort is delivering very poor quality. However, we should also evaluate the defects against the possible occurrences of the defects (defect opportunities). This shows an improved result, with the defect level at 1077 out of 4090 opportunities (26.33%) for the 20-week period.
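The defects-per-opportunity figure above is a straightforward ratio, using the counts quoted in the notes (1077 defects, 4090 opportunities):

```python
# Defects per opportunity over the 20-week baseline period,
# using the figures quoted in the notes.
defects = 1077
opportunities = 4090
rate = defects / opportunities
print(f"{rate:.2%}")  # prints "26.33%"
```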
It is clearly necessary to analyze the cause of the defects that are occurring in order to get all the teams to an acceptable level of performance; otherwise, the delivery of the software solution will be in jeopardy.
The previous graphs show a lot of variability from week to week. Therefore, to see if there was any type of trend, an analysis of cumulative objects vs. defects was conducted to smooth the variability across the weeks. This clearly shows that the number of defects produced is higher than the actual number of objects, but the two lines run almost parallel.
In order to properly evaluate the effectiveness of the new process, data sampling was continued from weeks 21 through 31 using the same measurement criteria. This section shows the results for the new process only.
As can be seen below, a low number of objects were created in week 21, which was the first week in which the data architect was correcting the previous database and persistence defects in conjunction with the development teams. More significantly, it is the first of several weeks in which 0 defects were logged. The lowest number in any week recorded previously was 28 (week 15). There is also a large spike in objects created during week 26, with 434 new objects but only 10 defects. Within the 11-week control period, 1083 objects were created. The previous 20 weeks produced only 957 objects. Again, the real significance lies in the defects: only 38 in the control period compared to 1077 in the first 20 weeks. This shows not only a tremendous improvement in productivity, but also a quantum leap in quality.
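The before/after comparison can be worked through from the counts quoted above (957 objects and 1077 defects over 20 weeks; 1083 objects and 38 defects over 11 weeks):

```python
# Baseline (weeks 1-20) vs. control period (weeks 21-31),
# using the counts quoted in the notes.
baseline_weeks, baseline_objects, baseline_defects = 20, 957, 1077
control_weeks, control_objects, control_defects = 11, 1083, 38

baseline_rate = baseline_objects / baseline_weeks  # ~47.9 objects/week
control_rate = control_objects / control_weeks     # ~98.5 objects/week
productivity_ratio = control_rate / baseline_rate  # ~2.06x

baseline_dpo = baseline_defects / baseline_objects  # ~1.13 defects per object
control_dpo = control_defects / control_objects     # ~0.035 defects per object

print(f"productivity: {productivity_ratio:.2f}x")
print(f"defects/object: {baseline_dpo:.2f} -> {control_dpo:.3f}")
```

The roughly 2.06x object-creation rate is consistent with the productivity figure cited on the results slide, and the defects-per-object ratio drops by well over an order of magnitude.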
Because of the different time periods and values, the range of the scale is vastly different. To get a better indication of the impact of the remediation efforts, we need to plot the data as a continuation of the previous timeline.
Our target was to reduce the defect rate by 75%. We achieved an astounding 1972% improvement when comparing defect points per opportunity. We also gained a development productivity increase of 205%, which will allow the development team time to address the defect categories that were not targeted by this particular quality improvement initiative.
Talk about the manufacturing analogy: building quality in up front rather than relying on "inspection"