The Software Resource Data Report (SRDR) provides DoD cost analysts with a primary source of comprehensive, objective, and model-independent sizing, effort, and schedule data on large DoD software programs. The SRDR does not rely on rigid, standardized data fields for collecting software effort and sizing information (a feature advocated by Government, industry, and academia during the design of the SRDR). This flexibility lessens the reporting burden placed on industry and reduces 'data synthesis' errors. At the same time, it creates the need to develop and use sound techniques for analyzing, interpreting, and ultimately normalizing data from multiple DoD weapon system contractors, each of whom uses different terms and definitions for software sizing and effort.
This need for sound techniques to normalize software data from multiple developers will become more critical as the amount of data available to DoD analysts from SRDRs grows over the next decade. Analysts must properly analyze, interpret, and ultimately normalize the data before drawing any 'industry-average' conclusions, such as average productivity.
This presentation updates our investigation, for the Deputy Assistant Secretary of the Army for Cost and Economics, into methods and techniques for building a normalized software database. We used SRDRs, supplemented with data collected separately for the Army. We believe the techniques discussed in this presentation can serve as a basis for best practices in analyzing SRDR data (and other separately collected data) in the future. Our presentation discusses (1) the analytical approach, (2) problems we encountered with the data, (3) results of our analysis, (4) a summarized and sanitized view of the normalized data, and (5) recommendations for building a sound software industry-average database.
Initial Results Building a Normalized Software Database Using SRDRs
1. Initial Results Building a Normalized Software Database Using SRDRs
2009 ISPA/SCEA Professional Development and Training Workshop
St. Louis, MO
2 – 5 June 2009
Michael Gallo
Paul Hardin
Elizabeth Koza
Robert Bailey
Sponsor: Deputy Assistant Secretary of the Army for Cost & Economics
2. Background
• The Army is collecting software data to build a database
• Two primary collection sources
  – SRDRs
  – Army internal collection
• Current research is focused on weapon system software, not AIS programs
• Research approach
  – Minimize contractors' effort to report the data (i.e., maximize use of artifacts used internally by the contractors)
  – Avoid the use of subjective data fields

Army's Desired Uses of the Data
• Productivity factors
• Parametric estimating equations
• Calibration of commercial software cost models
• Sizing estimates
• Visualizing trends
• Sanity checks
3. Selected SRDR Data Element Summary

Section I – General Context
• System/Element Name
• Report as-of date
• Authorizing Vehicle
• Development Organization
• Software Process Maturity
• Precedents
• SRDR Data Dictionary Filename
• Comments

Section II – Product Description
• Functional Description
• Software Development Characterization
• Application Type
  – Primary and Secondary Programming Language
  – Percentage of Overall Product Size
  – Development Process
  – Upgrade or New Development?
  – SW Development Method
• Non-Developmental Software
  – COTS/GOTS Applications Used
  – Integration Effort (Optional)
• Staffing
  – Peak Staff
  – Peak Staff Date
  – Hours per Staff-Month
• Personnel Experience by Domain
• Comments
4. Selected SRDR Data Element Summary

Section III – Product Size
• Requirements Counts
  – Total Software Requirements
  – New Software Requirements
  – Total External Interface Requirements
  – New External Interface Requirements
  – Requirements Volatility
• Total Delivered Code Count
  – New Code
  – Reused With Modifications
  – Reused Without Modifications
  – Carryover Code
  – Auto-generated Code
  – Sub-contractor Code
  – Counting Convention
• Comments
+ SRDR Data Dictionary

Effort (staff-hours)
• Effort must be partitioned into a set of activities
• For each SW activity reported, the contractor must provide:
  – WBS Element reference
  – Start Month
  – End Month
  – Prime Contractor hours
  – All Other Sub-contractor hours
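Sections I–III together define a per-project record: size counts by category plus a set of discretely reported effort activities. Below is a minimal sketch of such a record in Python; the field names are our own shorthand for illustration, not the official SRDR schema.

```python
from dataclasses import dataclass, field

@dataclass
class ActivityEffort:
    """One discretely reported SW activity (Section III effort partition)."""
    wbs_element: str            # WBS Element reference
    start_month: int
    end_month: int
    prime_hours: float          # Prime Contractor staff-hours
    sub_hours: float            # All Other Sub-contractor staff-hours

@dataclass
class SrdrRecord:
    """Selected Section III size and effort elements for one project."""
    new_code: int
    reused_with_mods: int
    reused_without_mods: int
    carryover_code: int
    auto_generated: int
    subcontractor_code: int
    counting_convention: str    # e.g. "logical"
    activities: list[ActivityEffort] = field(default_factory=list)

    def total_hours(self) -> float:
        return sum(a.prime_hours + a.sub_hours for a in self.activities)
```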
5. The Problem
• Data is collected from a variety of contractors and product mission areas, and is typically used and reflected as 'industry averages'
• Contractors use different definitions for reporting size, effort, and schedule data
  – There is no universal standard or mandate for software accounting and metric data
  – Data sources such as the SRDR permit contractors to tailor the report to the contractor's internal accounting and metrics systems
• Before the data can be used across projects/developers, it must be made comparable
6. Approach For Normalizing SW Data
1. Review data quality
2. Identify both the prevalent sizing metric in the data and the prevalent sizing categories used
3. Identify prevalent software activities included in reported 'chunks' of effort, especially activities that are reported discretely
4. Formulate a series of estimating equations that estimate constituent activities buried within reported 'chunks'; review context for important variables that can be used to explain differences in the data; derive coefficients
5. Apply estimating equations relatively to estimate missing pieces and to break apart chunks into discrete activities
7. Review Data Quality
• Raw data points were filtered to remove data deemed inadequate
  – Missing or incomprehensible definitions
  – No size or effort reported
  – No language reported
  – No counting convention reported
  – Reported only Total SLOC
  – Foreign Military Sales (FMS)
• Many of the data points removed came from a 3rd party

Data-quality review checklist:
• Understand the SW product
  – Mission, function, and complexity of the software
  – Platform and operating environment of the software
  – Programming languages used
• Understand the development project
  – Characterization of the development work
  – How the software product is put together
    • How it's integrated
    • How much was built with reused components
    • How much was auto-generated
• Understand who developed it
  – Primes
  – Subs
• Understand what's in the data reported
  – Scope of effort reported (what's included/excluded)
  – Units of sizing and rules for sizing categories
• Understand other attributes that might drive cost/schedule/quality
8. Identify Prevalent Sizing Metric and Categories
• What are the basic sizing units of measure in the dataset?
  – What is the prevalent counting convention?
  – What are the programming languages used?
• What sizing categories are used?
  – Varying definitions are used
  – Additionally: Auto-Generated, Carryover, Deleted, COTS
• ESLOC weights have a significant effect on the normalized data and will ultimately influence
  – Computed productivity (ESLOC/hr)
  – Observed effort vs. size trend

[Chart: Total SW Development Hours vs. KESLOC under three ESLOC weighting conventions: New + 0.00*Modified + 0.00*Unmodified (i.e., New SLOC); New + 1.00*Modified + 0.10*Unmodified; New + 1.00*Modified + 1.00*Unmodified (i.e., Total SLOC). Fitted trends: Hours = 4E-30*ESLOC^5.9339, Hours = 216.5*ESLOC^0.496, Hours = 0.0307*ESLOC^1.1955. Each point represents one project, but with a different computation of that project's ESLOC.]
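The weighting sensitivity is easy to see numerically. Here is a minimal sketch with hypothetical SLOC counts and effort; the three weight pairs are the conventions compared on the chart above.

```python
def esloc(new, mod, unmod, w_mod, w_unmod):
    """ESLOC = New + w_mod * Modified + w_unmod * Unmodified."""
    return new + w_mod * mod + w_unmod * unmod

new, mod, unmod = 50_000, 80_000, 200_000   # hypothetical SLOC counts
hours = 90_000                              # hypothetical reported effort

for label, w_mod, w_unmod in [("New SLOC only", 0.00, 0.00),
                              ("New + 1.00*Mod + 0.10*Unmod", 1.00, 0.10),
                              ("Total SLOC", 1.00, 1.00)]:
    e = esloc(new, mod, unmod, w_mod, w_unmod)
    print(f"{label:30s} ESLOC = {e:>9,.0f}  productivity = {e / hours:.2f} ESLOC/hr")
```

The same project yields productivities from roughly 0.6 to 3.7 ESLOC/hr depending solely on the chosen weights, which is why a common weighting scheme must precede any cross-project comparison.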
9. Identify Prevalent SW Activities in the Data

Each numbered source below reported one or more breakouts of effort; each line is one breakout, with the discretely reported 'chunks' of software activity separated by " | ". (In the original slide, activities in blue font reflect data from an SRDR source.)

1. Detailed Design + Coding + Unit Testing | Sys Req + Req Def + Integration + FQT | SW PM + SQE + CM Meetings
2. DIT | CTS | CPTO | LIM | MIS | CFIN | REQ | SBVT Demo | QA | CM
3. SW Requirements Analysis | SW Coding and Unit Testing | SW Developmental T&E | Other
   Systems Engineering | Software Engineering | Integration & Test | Database Mgmt & Direct Spt Functions | Subcontractor Effort
   System Eng Requirements Analysis and Arch | SW Requirements Analysis and Arch | SW Design | SW Coding and Unit Testing | SW Integration and Test | System Integration and Test | Management, Support, and Labs
4. SW Requirements Analysis | SW Architecture and Detailed Design | SW Coding and Unit Testing | Systems and System Test | Independent Test Group | Metrics, SCM, Documentation & Other Dept Support Efforts
5. Requirements | Design, Code, Test | System Integration | SW CM | SW Qual | SW Mgmt
6. Ada (Staff-months) | Ada83 (Staff-months) | Jovial (Staff-months) | Jovial/ASM (Staff-months)
7. SW Requirements | Preliminary Design | Detailed Design | Code & Unit Test | Mega Level Test (MLT) | Element Level Test (ELT)
   Detailed Design + Code & Unit Test + Mega Level Test + Element Level Test + PIV Defects
8. Design and Document Review | Interface Design Documents | Coding | Code Reviews | Integration and Test | Other Documentation
9. SW Requirements Analysis | SW Architecture and Detailed Design | SW Coding and Unit Testing | SW Integration and System/SW Integration | Software Qualification Testing | SW Developmental T&E | Other
   Requirements Analysis | Detailed Design Iterations, High Level Design, PDR, CDR & OCSD | Code & Unit Test | SW/SW Integration Testing | SW/HW Integration Testing | Support to Systems, SCM, Sys Admin, VDD Test, ER & ILS | SDP, SQE, Management
   System Requirements Definition | Implementation | SW/HW Test | System Integration | System Verification | CM, QA, PM
10. SW Requirements Analysis | SW Architecture and Detailed Design | SW Coding and Unit Testing | SW Int. and System/SW Int. | SW Qualification Testing | SW Development Test and Evaluation | CM, SW Safety, SW Process Improvement
11. SW Requirements Analysis | SW Architecture and Detailed Design | SW Coding and Unit Testing | SW Int. and System/SW Int. | SW Qualification Testing | Mgmt, SW CM, SW Process, System Admin Spt, Subcontract Mgmt Spt
12. SW Requirements Analysis | SW Architecture and Detailed Design | SW Coding and Unit Testing | SW Int. and System/SW Int. | SW Qualification Testing
13. Systems Engineering Activities | SW Engineering Activities | SW Integration Activities
    SW Requirements Analysis | SW Design/Code/Test & Integration | SW Quality Assurance | SW Configuration Mgmt | SW Environment Support
    SW Requirements Analysis | SW Design/Code Test and Integration | Other mgmt and spt (build related)
14. SW Requirements Analysis | SW Design/Code | SW Test and Integration | Help Desk and User Spt | Other mgmt and spt (build related)
    SW Eng Requirements Analysis | SW Architecture and Detailed Design | SW Coding and Unit Testing | SW Integration and System/Software Integration | SW Development Test and Evaluation | CM, SW QA, and Dev Environment
15. SW Requirements Analysis | SW Architecture and Detailed Design | SW Coding and Unit Testing | SW Integration and System/SW Integration | SW DT&E | SW Mgmt, CM, SW QM, Environmental Dev
16. SW Requirements Analysis | SW Architecture and Detailed Design | SW Coding | SW Unit Testing | SW Integration and Test | SW Qualification Testing | Build Specific Systems Development
17. SW Requirements Analysis | SW Design | SW Implementation and Unit Testing | SW Verification | QA, SCM, SW Environment
18. SW Requirements Analysis | SW Architecture and Detailed Design | SW Coding and Unit Testing | SW Integration and System/SW Integration
19. SW Requirements Analysis | SW Architecture & Detailed Design | SW Coding & Unit Testing | SW Integration & System/SW Integration | SW Developmental Test & Evaluation | Spiral Level Leadership | Other Direct Hours
20. SW Requirements + Preliminary Design + Detailed Design + Code & Unit Test + SW I&T
21. Requirements Analysis | Model Development | Design | C&UT | Integration & Test | FQT | PM | Data | QA | DevEnv | CM & QA

Each is a discretely reported 'chunk' of software activity.
10. Identify SW Activities in the Data
(This slide repeats the Slide 9 table of discretely reported 'chunks'; blue font = SRDR Data Source.)
12. Some Approaches to Normalize Effort
• Filter to include projects with the same set of activities (see the filtering sketch after this slide)
  – Need many data points
  – Or filter to maximize the # of activities (results in a small # of projects)
  – Or filter to maximize the # of projects (results in a small # of activities)
• Use a factor to fill in missing activities
  – Can't derive a factor in isolation to add/remove activities because:
    • Factor can change as size increases
    • Need unique factors for each unique 'chunk'
• Derive a non-linear equation in isolation to add/remove activities
  – Basically, filtering on activities reported discretely (greens)
  – Multiple projects must have reported the activity discretely
  – Would result in dropping data that contain actuals that are included in chunks (yellows)

[Chart: Changes in Distribution of Effort; percent of total effort by activity (System and SW Requirements; Design; Code & Unit Test; SW and System I&T; PM, CM, QA, Data) versus development size (Equivalent New HOL, 10,000 to 10,000,000). Source: VERA]
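A minimal sketch of the first approach, filtering to projects that report a common activity set, using hypothetical project data; it makes the activities-vs-projects trade-off concrete.

```python
projects = {  # hypothetical: project -> set of discretely reported activities
    "P1": {"Req", "Design", "C&UT", "I&T"},
    "P2": {"Req", "C&UT", "I&T", "QA"},
    "P3": {"Req", "Design", "C&UT"},
}

def reporting(required: set[str]) -> list[str]:
    """Projects whose reported activities cover the required set."""
    return [p for p, acts in projects.items() if required <= acts]

# Maximize the # of activities -> few projects; maximize projects -> few activities
print(reporting({"Req", "Design", "C&UT", "I&T"}))  # ['P1']
print(reporting({"Req", "C&UT"}))                   # ['P1', 'P2', 'P3']
```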
14. General Formulation of Approach
• Each 'chunk' consists of a set of estimated activities:
  \hat{E}_{CH} = \sum_i \hat{E}_i
• Total estimated project effort consists of a set of estimated chunks:
  \hat{E}_T = \sum_{CH} \hat{E}_{CH}
  where E_T = total reported project effort, \hat{E}_T = total estimated project effort, \hat{E}_{CH} = estimated chunk effort, and \hat{E}_i = estimated effort of activity i.
• Each effort estimating relationship (EER) provides an estimate for one activity:
  \hat{E}_i = A_i \left[ k_i (\mathrm{ESLOC}_i)^{b_i} \cdot (EAF_1)^{\mathrm{Platform}} \cdot (EAF_2)^{\mathrm{Dev\_Type}} \cdot (EAF_3)^{\mathrm{CMM}} \cdot (EAF_4)^{\mathrm{SystemEffect}} \right]
  where
  – A_i = 1 if activity i is included in total effort (E_T); else A_i = 0
  – ESLOC_i = Equivalent New Lines of Code for activity i
  – Platform = 1 if the platform is Air; else Platform = 0
  – Dev_Type = 1 if single developer or subcontractor; else Dev_Type = 0
  – CMM = CMM Level of Developer minus Avg CMM Level of Database
  – SystemEffect = 1 if the system is part of a larger development project; else SystemEffect = 0
• ESLOC is specified uniquely for each activity:
  \mathrm{ESLOC}_i = \mathrm{New} \cdot CF_i + \sum_{j=1}^{m} \mathrm{Not\_New}_j \cdot CF_i \cdot w_{ij}
  where CF_i = Counting Convention Factor, w_{ij} = equivalent-new fractional cost, and j indexes the SLOC categories that are considered "Not New".
• Use constrained optimization to solve for coefficients that minimize residuals of estimated effort at both the project level and the chunk level (see the sketch after this slide)

Challenges Encountered
• Not enough data points: 211 unknowns, but only 168 degrees of freedom in the data
• 'Partial' programs drove the analysis to illogical results
• Required significant tailoring
  – Reduced SLOC categories
  – Common ESLOC weights
  – Common D.O.S. for groups of activities
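The slides do not name a particular solver, so the sketch below uses scipy.optimize.minimize (L-BFGS-B with bounds) as one plausible realization of the constrained fit. The EER is simplified to k_i and b_i per activity (EAF terms and ESLOC weights held fixed), and the chunk data are hypothetical; project-level totals enter simply as chunks that span all activities.

```python
import numpy as np
from scipy.optimize import minimize

ACTIVITIES = ["Req", "Design", "C&UT", "I&T"]
N = len(ACTIVITIES)

def residuals(params, chunks):
    """Sum of squared relative errors over reported 'chunks'.

    Each chunk is (ESLOC, actual staff-hours, indices of the activities
    it bundles); a whole project is just a chunk spanning all activities,
    so this penalizes misses at both the chunk and project level.
    """
    k, b = params[:N], params[N:]
    err = 0.0
    for esloc, actual, idxs in chunks:
        est = sum(k[i] * esloc ** b[i] for i in idxs)
        err += ((est - actual) / actual) ** 2
    return err

chunks = [
    (120_000, 30_000, [0, 1]),       # Req + Design reported together
    (120_000, 55_000, [2, 3]),       # C&UT + I&T reported together
    (40_000, 28_000, [0, 1, 2, 3]),  # whole project reported as one chunk
]

x0 = np.concatenate([np.full(N, 0.01), np.full(N, 1.1)])
bounds = [(1e-6, 1.0)] * N + [(0.8, 1.5)] * N  # keep k_i and b_i plausible
sol = minimize(residuals, x0, args=(chunks,), bounds=bounds, method="L-BFGS-B")
print(dict(zip(ACTIVITIES, np.round(sol.x[:N], 5))))
```

As on the slide, a fit like this is badly underdetermined unless the coefficient space is reduced (common weights, shared exponents across activity groups), which is exactly the tailoring the authors describe.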
15. Initial Results

• Effort = A * ESLOC^B * C^PLATFORM * D^DEV-TYPE * E^(CMM - 4.23)

EER coefficients¹:

| SW Activity | A | B | C | D | E |
|---|---|---|---|---|---|
| Requirements | 0.00470 | 1.1100 | 1.6 | 0.89 | 0.91 |
| Architecture | 0.00110 | 1.1733 | 1.6 | 0.89 | 0.91 |
| Initial Design | 0.00190 | 1.1733 | 1.6 | 0.89 | 0.91 |
| Detailed Design | 0.00450 | 1.1733 | 1.6 | 0.89 | 0.91 |
| Code & Unit Test | 0.03000 | 1.0893 | 1.6 | 0.89 | 0.91 |
| Formal Integration | 0.00400 | 1.2393 | 1.6 | 0.89 | 0.91 |
| Integration Testing | 0.01540 | 1.2393 | 1.6 | 0.89 | 0.91 |
| System Testing | 0.00450 | 1.2393 | 1.6 | 0.89 | 0.91 |
| Acceptance Testing | 0.00370 | 1.2393 | 1.6 | 0.89 | 0.91 |
| Configuration Mgt | 0.00540 | 1.1116 | 1.6 | 0.89 | 0.91 |
| Project Plans | 0.00100 | 1.1116 | 1.6 | 0.89 | 0.91 |
| Program Mgt | 0.00390 | 1.1116 | 1.6 | 0.89 | 0.91 |
| Quality Assurance | 0.00800 | 1.1116 | 1.6 | 0.89 | 0.91 |

ESLOC weights (identical for every activity): New = 1.0000, Mod = 0.5624, Reused = 0.2200, Carryover = 0.0420, Auto-generated = 0.0400

¹ Included in Normalized Database. SLOC = Logical Lines of Code.

Fit statistics (Incl./Excl. = data points included in / excluded from the solution):

| Level | Avg % Error (Incl.) | Avg % Error (Excl.) | PRED(20) (Incl.) | PRED(20) (Excl.) |
|---|---|---|---|---|
| Project | 4% | 200% | 80% | 20% |
| Chunk | 169% | 344% | 26% | 9% |
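As a worked example, the Code & Unit Test EER can be evaluated directly; the coefficients and ESLOC weights come from the tables on this slide, while the project inputs are hypothetical.

```python
def eer_hours(esloc, A, B, C=1.6, D=0.89, E=0.91,
              platform_air=0, single_dev=0, cmm=4.23):
    """Effort = A * ESLOC^B * C^PLATFORM * D^DEV-TYPE * E^(CMM - 4.23)."""
    return A * esloc ** B * C ** platform_air * D ** single_dev * E ** (cmm - 4.23)

# ESLOC with the published weights (New 1.0, Mod 0.5624, Reused 0.22):
esloc = 60_000 * 1.0 + 20_000 * 0.5624 + 50_000 * 0.22   # hypothetical counts

# Code & Unit Test row: A = 0.03000, B = 1.0893; Air platform, CMM level 5
print(f"{eer_hours(esloc, A=0.03000, B=1.0893, platform_air=1, cmm=5):,.0f} hours")
```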
16. Initial Results (Cont'd)

[Chart: Standardized, Normalized Total Project Effort (0 to 700,000) vs. Standardized Avg ESLOC (0 to 500,000), with the aggregate EER estimate trend. Full Development points were included in the solution; Partial Development points were excluded.]

Data points are normalized to a common set of activities and standardized to remove the effects of Air Platform, CMM, and multi-vendor development.
17. Application of EERs

Breaking Apart Chunks ("Yellows"):
  \mathrm{Factor}_i = \mathrm{EER}_{ij} / \sum_i \mathrm{EER}_{ij}
  \hat{E}_i = \mathrm{Factor}_i \cdot E_{CH_j}
  where EER_{ij} = EER estimate of activity i in chunk j, \sum_i EER_{ij} = estimate of all activities included in chunk j, and E_{CH_j} = actual effort of chunk j.

Estimating Missing Activities ("Reds"):
  \mathrm{Factor}_i = \mathrm{EER}_i / \sum_i \mathrm{EER}_i
  \hat{E}_i = \mathrm{Factor}_i \cdot E_{\mathrm{Project}}
  where EER_i = EER estimate of activity i, \sum_i EER_i = estimate of all activities included in the actual project effort, and E_{Project} = actual project effort.

• Estimating equations are used relatively to add/remove activities from the normalized effort (see the sketch below)
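A minimal sketch of the proration, with hypothetical EER values; the EER estimates matter only relatively, since the chunk's actual hours are preserved and merely redistributed.

```python
def split_chunk(chunk_hours: float, eers: dict[str, float]) -> dict[str, float]:
    """E_i = (EER_i / sum of EERs in the chunk) * actual chunk effort."""
    total = sum(eers.values())
    return {act: chunk_hours * e / total for act, e in eers.items()}

# A 'Design, Code, Test' chunk reported as 40,000 actual staff-hours:
print(split_chunk(40_000, {"Design": 6_500, "Code & Unit Test": 10_100,
                           "Testing": 7_400}))
```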