COST ASSESSMENT DATA ENTERPRISE
Software Resource Data Reporting (SRDR)
2
Overview
Software Resource Data Reporting (SRDR)
Resources
Data Item Description (DID):
DI-MGMT-82035
Development, Format 1:
DD Form 3026-1
Maintenance, Format 2:
DD Form 3026-2
Enterprise Resource Planning (ERP):
DD Form 3026-3
Intent of SRDR is to collect objective and measurable data
commonly used by industry and DoD cost analysts
Builds a repository of estimated and actual:
› SW product sizes
› Schedules
› Effort
› Quality
3
New Formats
Software Resource Data Reporting (SRDR)
Development (Format 1)
Part 1 – Technical Data: SW size, context, technical information; Release level and computer SW configuration item (CSCI) level sections
Part 2 – Effort Data: Reports SW efforts associated with each reported release and CSCI
Maintenance (Format 2)
Part 1 – Technical Data: SW size, context, technical information; Top level and release level sections
Part 2 – Effort Data: Reports the to-date SW maintenance efforts for each in-progress and completed release, and total maintenance activities
ERP (Format 3)
Part 1 – Technical Data: Provides context, SW product, Development Team, etc.
Part 2 – Effort Data: Project resource and schedule information at the release level
4
Updates
Software Resource Data Reporting (SRDR)
No Additional Significant Updates to Format 1 and 2
Format 1 (Development) Updates
Introduction of Agile Measures
› Supports capturing the LOE with this SW dev
methodology
Instructions captured within DID
Format 2 (Maintenance) Updates
Clarified SW change count definitions
› Confusion in original DID regarding inclusion of
Information Assurance Vulnerability Alerts (IAVAs)
› Updated instructions to ensure IAVA changes are
mutually exclusive from Priority 1-5 changes, and are
not double counted
Moved SW License Management Hours reporting to
Project Management (PM)
› SW License Management is a PM activity
5
Submission Event Process Flow Diagram
Software Resource Data Reporting (SRDR)
[Figure: Notional submission event process flow. Submission Events 1-5 run from Contract Initial through Contract Interim reports to Contract Final, while Releases 1-4 each progress through Release Initial, Release Interim, and Release Final reports. For each event, the figure shows what each individual submission contains for each release: an Estimate, In-Process/Actuals, Final/Actuals, or "No change from previous report," with all release reports due at the final event.]
COST ASSESSMENT DATA ENTERPRISE
Software Resource Data Reporting (SRDR)
Presented by:
Paul Kopicki, paul.a.kopicki.civ@mail.mil
7
AGENDA
Software Resource Data Report (SRDR)
› Software DID Overview (Development/Maintenance/ERP)
› Software Data DID CRM Review
› Software Data DID Path Forward
Software DID Overview
(Development/Maintenance/ERP)
9
Software DID Overview
Software DID Overview (Development/Maintenance/ERP)
› Software Resources Data Reporting:
• Data Item Description: DI-MGMT-82035
• Development, Format 1: DD Form 3026-1
• Maintenance Report, Format 2: DD Form 3026-2
• Enterprise Resource Planning (ERP), Format 3: DD Form 3026-3
10
Data Item Description (DID): DI-MGMT-82035
Software DID Overview (Development/Maintenance/ERP)
› DID summarizes the Software (SW) Development Report, Maintenance Report,
and ERP SW Development Report
› Provides instructions to support the data and frequency requirements specified in CSDR reporting
› Intent of SRDR is to collect objective, measurable data commonly used by
industry and DoD cost analysts
› Builds a repository of estimated and actual:
• SW product sizes
• Schedules
• Effort
• Quality
11
Development, Format 1: DD Form 3026-1
Software DID Overview (Development/Maintenance/ERP)
Consists of two parts
Part 1 – SW Development Technical Data
› SW size, context, technical information
› Release Level and Computer SW Configuration
Item (CSCI) Level sections
Part 2 – SW Development Effort Data
› Reports SW efforts associated with each
reported release and CSCI
Development, Format 1: DD Form 3026-1
Software DID Overview (Development/Maintenance/ERP)
12
Format 1 updates:
› Introduction of Agile Measures
› Instructions captured within DID
Agile Measures (SECTION 3.3.2.6.4): Days per Sprint, Days per Release
Release Map (SECTION 3.3.2.6.5)
Planned and Achieved Development by Feature per Epic (SECTION 3.3.2.6.6), one row per Feature: Epic/Capability Identifier, Feature Identifier, Planned Stories per Feature by Epic, Actual Stories, Planned Story Points per Feature by Epic, Actual Story Points, Planned Hours per Feature by Epic, Actual Feature Hours. Example row: AAAA, 1.x.x, 5, 3, 15, 10, 200, 180. (Insert rows as needed to account for all Features and Epics mapped to this Release.)
Summary Totals for This Release (SECTION 3.3.2.6.7): the sum of all rows with data by Feature by Epic, reported as Quantity Planned and Quantity Actually Developed for each Item: Total Epics/Capabilities, Total Features, Total Stories, Total Story Points, Total Feature Hours, Total Sprints
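To make the structure of the new Agile Measures tables concrete, here is a minimal Python sketch of one planned-vs-achieved row per Feature per Epic and the release summary roll-up; the class, field, and function names are illustrative assumptions, not names taken from the DID or form.

```python
from dataclasses import dataclass

@dataclass
class FeatureRecord:
    """One row of 'Planned and Achieved Development (by Feature per Epic)'."""
    epic_id: str               # Epic/Capability Identifier
    feature_id: str            # Feature Identifier
    planned_stories: int
    actual_stories: int
    planned_story_points: int
    actual_story_points: int
    planned_hours: float
    actual_hours: float        # Actual Feature Hours

def release_summary(rows):
    """Sum all rows by Feature by Epic, as in 'Summary Totals for This Release'."""
    return {
        "total_epics": len({r.epic_id for r in rows}),
        "total_features": len(rows),
        "total_stories": (sum(r.planned_stories for r in rows),
                          sum(r.actual_stories for r in rows)),
        "total_story_points": (sum(r.planned_story_points for r in rows),
                               sum(r.actual_story_points for r in rows)),
        "total_feature_hours": (sum(r.planned_hours for r in rows),
                                sum(r.actual_hours for r in rows)),
    }

# Example row from the form excerpt above: AAAA, 1.x.x, 5, 3, 15, 10, 200, 180
rows = [FeatureRecord("AAAA", "1.x.x", 5, 3, 15, 10, 200, 180)]
print(release_summary(rows))  # tuples are (planned, actual)
```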
No Additional Significant Updates to Format 1
13
Maintenance Report, Format 2: DD Form 3026-2
Software DID Overview (Development/Maintenance/ERP)
Consists of two parts
Part 1 – SW Maintenance Technical Data
› SW size, context, technical information
› Top Level and Release Level sections
Part 2 – SW Maintenance Effort Data
› Reports the to-date SW maintenance efforts
for each in-progress and completed release(s),
and total maintenance activities
14
Maintenance Report, Format 2: DD Form 3026-2
Software DID Overview (Development/Maintenance/ERP)
Format 2 updates
› Clarified SW change count definitions
• Confusion in original DID regarding inclusion of Information Assurance Vulnerability Alerts (IAVAs)
• Updated instructions to ensure IAVA changes are mutually exclusive from Priority 1-5 changes,
and are not double counted
› Moved SW License Management Hours reporting to Project Management (PM)
• SW License Management is a PM activity
No Additional Significant Updates to Format 2
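The change-count clarification above can be illustrated with a small, hypothetical Python sketch in which an IAVA-driven change is tallied only in its own bucket and never also under a Priority 1-5 bucket; the record layout and category labels are assumptions for illustration, not DID field names.

```python
from collections import Counter

def count_changes(changes):
    """Tally software maintenance changes; an IAVA change is counted only in
    the 'IAVA' bucket, never also under its Priority 1-5 bucket (no double
    counting)."""
    counts = Counter()
    for change in changes:
        if change.get("is_iava"):
            counts["IAVA"] += 1          # mutually exclusive from Priority 1-5
        else:
            counts[f"Priority {change['priority']}"] += 1
    return counts

# Example: one IAVA-driven fix and two ordinary Priority 2 changes
changes = [
    {"is_iava": True, "priority": 1},
    {"is_iava": False, "priority": 2},
    {"is_iava": False, "priority": 2},
]
print(count_changes(changes))  # Counter({'Priority 2': 2, 'IAVA': 1})
```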
15
Enterprise Resource Planning (ERP), Format 3: DD Form 3026-3
Software DID Overview (Development/Maintenance/ERP)
SW Development Report specifically for ERP or Defense Business System programs
› Financial
› Personnel
› Inventory
Consists of two parts
Part 1 – System Technical Data
› Provides context, SW product, Development
Team, etc.
Part 2 – SW Development Effort Data
› Project resource and schedule information at the Release Level
16
Enterprise Resource Planning (ERP), Format 3: DD Form 3026-3
Software DID Overview (Development/Maintenance/ERP)
Part 1 – Technical Data
ENTERPRISE RESOURCE PLANNING (ERP) SOFTWARE RESOURCES DATA REPORTING, FORMAT 3:
PART 1 Software Development Technical Data (Project) SECTION 3.3
A. Report Context 3.3.1 (Description of Actual Development Organization)
1. Release Name: 3.3.1.1 Release ID: 3.3.1.2
2. Certified CMM Level (or equivalent): 3.3.1.3
3. Certification Date: 3.3.1.4
4. Lead Evaluator of CMM Certification: 3.3.1.5
5. Affiliation to Development Organization: 3.3.1.6
6. Precedents (list up to ten similar systems completed by the same organization or team): 3.3.1.7
7. Comments on Subsection A responses: 3.3.1.8
B. Product and Team Description 3.3.2
8. Project Functional Description: 3.3.2.1
9. Primary Application Domains for ERP Programs 3.3.2.2 (Microcode and Firmware; Signal Processing; Vehicle Payload; Vehicle Control; Other Real-Time Embedded; Command and Control; Communication; System Software; Process Control; Scientific and Simulation; Test, Measurement, and Diagnostic Equipment; Training; Software Tools; Mission Planning; Custom AIS Software; Enterprise Service System; Enterprise Information System)
10. Primary Application Domain comments 3.3.2.3
11. Comments on Subsection B responses: 3.3.2.4
17
Enterprise Resource Planning (ERP), Format 3: DD Form 3026-3
Software DID Overview (Development/Maintenance/ERP)
Part 1 – Technical Data
18
Enterprise Resource Planning (ERP), Format 3: DD Form 3026-3
Software DID Overview (Development/Maintenance/ERP)
Part 2 – SW Development Effort Data
19
Enterprise Resource Planning (ERP), Format 3: DD Form 3026-3
Software DID Overview (Development/Maintenance/ERP)
Part 2 – SW Development Effort Data
20
Enterprise Resource Planning (ERP), Format 3: DD Form 3026-3
Software DID Overview (Development/Maintenance/ERP)
Part 2 – SW Development Effort Data
Software Data DID
CRM Review
21
SRDR DID & Forms CRM Review
Software Data DID CRM Review
22
Document | # of Comments | Taking Action | No Action
General SRDR DID/Form | 20 | 7 | 13
Maintenance Form | 11 | 7 | 4
ERP Form | 6 | 2 | 4
Total | 37 | 16 | 21
[Chart: 37 comments broken out by commenting organization (DASA-CE, LM, SSC-SEIT, MDA/CP, NAVAIR) and by Taking Action vs. No Action.]
Several rounds of comments through Gov. and Industry
Last round of DID and Forms sent in April 2017;
reviewed/adjudicated comments with the SRDR WG through early May
› Open comments coordinated with developers and
closed out with POCs end of May
SRDR DID & Forms CRM Review
Software Data DID CRM Review
23
[Chart (repeated from previous slide): 37 comments broken out by commenting organization (DASA-CE, LM, SSC-SEIT, MDA/CP, NAVAIR) and by Taking Action vs. No Action.]
Major DID Changes since May 2017
› Agile metrics reporting
• From: All agile programs
instructed to use ERP instructions
and submit agile data using
Format 3
• To: Format 1 was updated to
include Agile table too
› New version of Figure 1 to clarify intent
Software Data DID
Backup
24
25
Agile Measures
Software Data DID Backup
For clarity: Cross-references from Form 1 to Form 3 were eliminated by moving the Agile Measures section
into Form 1
Comment: For clarity and consistency, update the Traditional Development Form and DID language to include the Agile Measures section that is currently found within the ERP section of the DID.
Proposed Solution: Updated the Forms and DID to duplicate the ERP agile measures language and requirement within the Development form. Agile measures are now reflected on both forms and in both sections of the DID.
26
Submission Event Process Flow Diagram
Software Data DID Backup
Comment: Text contradicts the process flow diagram (Figure 1) found within the DID. Provide clarification to update the text and image so it is in alignment with reporting (initial, interim, final) deliverables and expectations.
Proposed Solution: Updated Figure 1 and associated text.
Notional submission event diagram and associated text were confusing; to avoid confusion, Figure 1 was updated (next slide)
27
Submission Event Process Flow Diagram
Software Data DID Backup
[Figure: Updated Figure 1, notional submission event process flow. Submission Events 1-5 run from Contract Initial through Contract Interim reports to Contract Final, while Releases 1-4 each progress through Release Initial, Release Interim, and Release Final reports. For each event, the figure shows what each individual submission contains for each release: an Estimate, In-Process/Actuals, Final/Actuals, or "No change from previous report," with all release reports due at the final event.]
DID Comments Summary
Software Data DID Backup
28
Of the 37 comments….
› Comments bucketed into 7 categories
› Majority of comments were for
additional Clarification or Questions
› Comments were not received
differentiating between critical,
substantive, or administrative
18
10
3 3
1 1 1
0
2
4
6
8
10
12
14
16
18
20
# of Comments
Clarification……………………….…………. 18
Question………………………………….…… 10
Administrative…………………………..…. 3
Process Diagram…………………………... 3
Duplication Comment…………………… 1
Not Actionable………………….………….. 1
General..…………..……………..…………… 1
37 Comments
29
Development, Format 1: DD Form 3026-1, Initial Report, Common Heading- Faked
Software Data DID Backup
PRIME MISSION PRODUCT AH-95 Karuk
PHASE/MILESTONE: REPORTING ORGANIZATION TYPE
Pre-A X B C - FRP X PRIME/ASSOCIATE CONTRACTOR
A C - LRIP O&S DIRECT-REPORTING SUBCONTRACTOR
GOVERNMENT
PERFORMING ORGANIZATION DIVISION
a. NAME: Stark Industries a. NAME: Stark Aircraft
b. ADDRESS: 123 Stark Blvd. b. ADDRESS: 123 Stark Blvd.
Orlando, FL Orlando, FL
APPROVED PLAN NUMBER A-10-C-C25 CUSTOMER
d. NAME: Prototype Development e. TASK ORDER/DELIVERY ORDER/LOT NO.: 01
REPORT TYPE INITIAL X INTERIM FINAL
SUBMISSION NUMBER 2
X RDT&E RESUBMISSION NUMBER 0
PROCUREMENT REPORT AS OF (YYYYMMDD) 20120930
X O&M DATE PREPARED (YYYYMMDD)
DEPARTMENT
REMARKS:
20121130
b. END DATE (YYYYMMDD):
c. SOLICITATION NO.: N/A
b. MODIFICATION NO.: 1
APPROPRIATION
N/A
20170930
TYPE ACTION a. CONTRACT NO: WQ5935-12-D-0001
PERIOD OF PERFORMANCE
20120501
a. START DATE (YYYYMMDD):
Potts, Pepper Stark Aircraft - Software Division 123-456-7890 ppotts@starkindustries.com
POC NAME (Last, First, Middle Initial) TELEPHONE (Include Area Code) EMAIL ADDRESS
UNCLASSIFIED
SECURITY CLASSIFICATION
All software development and maintenance effort is under WBS element 1.1.2.1 Custom Software Design. There is no software development or maintenance under the other WBS
elements.
Appropriation is both RDT&E and O&M (same as associated CSDR)
Software Development Report
UNCLASSIFIED
SECURITY CLASSIFICATION
SOFTWARE RESOURCES DATA REPORTING: Metadata
MAJOR PROGRAM NAME: AH-95 Karuk
The public reporting burden for this collection of information is estimated to average 16 hours per response, including the time for reviewing instructions, searching existing data sources, gathering and
maintaining the data needed, and completing and reviewing the collection of information. Send comments regarding this burden estimate or any other aspect of this collection of information, including
suggestions for reducing the burden, to Department of Defense, Washington Headquarters Services, Executive Services Directorate, Directives Division, 4800 Mark Center Drive, East Tower, Suite 02G09,
Alexandria, VA 22350-3100 (0704-0188). Respondents should be aware that notwithstanding any other provision of law, no person shall be subject to any penalty for failing to comply with a collection of
information if it does not display a currently valid OMB control number. PLEASE DO NOT RETURN YOUR COMPLETED FORM TO THE ABOVE ORGANIZATION.
OMB Control Number 0704-0188
Expiration Date: 8/31/2016
30
Development, Format 1: DD Form 3026-1, Initial Report, Release Level - Faked
Software Data DID Backup
1.0
Release Start Date 20120501 Release End Date
Hours Per Staff Month Computed ? No
1.1.2.1 Mission Computer Update Definition
ISO 12207 Processes included… SW RA X SW AD X SW DD X SW Con X SW I X SW QT X SW SP X
1.1.2.1 Smart Display Definition
ISO 12207 Processes included… SW RA X SW AD X SW DD X SW Con X SW I X SW QT X SW SP X
1.2.1 Software Systems Engineering Definition
ISO 12207 Processes included… SW RA X SW AD X SW DD SW Con SW I SW QT SW SP
1.3.1 Software Program Management Definition
ISO 12207 Processes included… SW RA X SW AD SW DD X SW Con X SW I SW QT X SW SP
1.4.1.1 Development Test and Evaluation (SW-Specific) Definition N/A - Regression Testing is included as part of the Development Activities
ISO 12207 Processes included… SW RA SW AD SW DD SW Con SW I SW QT SW SP
1.5.1 Training (SW-Specific) Definition N/A - Training is part of Program Management
ISO 12207 Processes included… SW RA SW AD SW DD SW Con SW I SW QT SW SP
1.6.2.1 Engineering Data (SW-Specific) Definition N/A - Data is provided in the comments of the Mission Computer Update
ISO 12207 Processes included… SW RA SW AD SW DD SW Con SW I SW QT SW SP
Contractor-Defined Software
Development Activities
LEAD EVALUATOR
CMMI Level 3
CERTIFICATION DATE
SOFTWARE PROCESS
MATURITY
20110923
The Mission Computer Update provides the control and operator interfaces for the AH-95 mission systems and subsystems. It
provides the central computation and control facility that integrates the avionics suite into a cohesive system. The Mission Computer
DEVELOPMENT ORGANIZATION
CMMI Institute
James Rhodes
EVALUATOR AFFILIATION
SOFTWARE DEVELOPMENT REPORT, FORMAT 1: Part 1 Software Development Technical Data, Release Level
Release ID Release Name Mission Computer Update 2018 Baseline
20130930
Software Requirements Count Definition
Total Labor Hours Total Staff
135000
The Software Requirements count is defined by counting the number of "Shall" statements in the latest Software Requirements Specification (SRS). This includes functional, non-functional, security, safety, privacy and cyber security requirements.
External Interface Requirements Count Definition The External Interface Requirements count is defined by counting the number of unique Interface statements in each Subsystem Interface Description Document (IDD).
154 112
Software-Specific Common Elements
(as defined in the CSDR Plan)
Product Quality Reporting Definition
The Smart Display (SD) is a software application that provides flight instrument graphics. It is used in cockpit display systems to
support flight and mission activities.
Program Management performed by Stark Industries in Orlando, FL consists of team management and training utilizing spiral
development strategies through the life of the program
Systems Engineering performed by Stark Industries in Orlando, FL consists of software requirements and initial design efforts
31
Development, Format 1: DD Form 3026-1, Initial Report, Release Level - Faked
Software Data DID Backup
Aerospace UCC Version
Alternate Code Counter Description
DD Form3026-1, MAY 2016
UNCLASSIFIED
SECURITY CLASSIFICATION
RELEASE-LEVEL COMMENTS
The precedents listed above are similar projects in that they all share the engineers and development team. They also use the same development, lab environments and software tools.
Alternate Code Counter Comparison to UCC
Out of a comparison of 381 source files, the UCC tool counted 1.3% more
logical lines of source code than Code Counter Plus.
Third-Party Code Counter
N/A Alt Code Counter Version 2.0
Flight Armor Software Mark 3
Automobile Flight Software Mark 21
Personal Helicopter Mission Computer
Alternate Code Counter Name Code Counter Plus
SYSTEM DESCRIPTION The AH-95 Karuk is an Attack Helicopter (AH) with the capability to perform the primary missions of Infantry Support and Close Air Combat.
This software is an update to the Mission Computer for the helicopter.
PRECEDENTS (List at least three similar systems.)
32
Development, Format 1: DD Form 3026-1, Initial Report, CSCI Level - Faked
Software Data DID Backup
Product and Development Description
Functional Description
Software Development Characterization
Software State of Development (Check one only ) Prototype Production-Ready X Mix
Operating Environment(s) (Check all that apply )
Surface Fixed Surface Vehicle Ordnance Systems Other
Surface Mobile X Air Vehicle Missile Systems If Other, provide explanation
Surface Portable Sea Systems Space Systems
X Manned Unmanned
Primary Application Domain (Check one only )
Microcode and Firmware Communication Software Tools
Signal Processing X System Software Mission Planning
Vehicle Payload Process Control Custom AIS Software
Vehicle Control Scientific and Simulation Enterprise Service System
Other Real-Time Embedded Test, Measurement, and Diagnostic Equipment Enterprise Information System
Command and Control Training
Application Domain Comments
Development Process
Software Development Method
This product is a new development No This is a modification of an existing Mission Computer
N/A - The program is currently in requirements development phase; Maintenance will begin immediately upon deployment
The Mission Computer Update provides the control and operator interfaces for the AH-95 mission systems and subsystems. It provides the central computation and control facility that integrates the avionics suite into a cohesive system.
Spiral
The methodology used for Software Development is the standard Spiral development process that has been used previously. The Software Development team will receive software requireme
33
Development, Format 1: DD Form 3026-1, Initial Report CSCI Level - Faked
Software Data DID Backup
Software Reuse
Name Description
Name Description etc.
COTS / GOTS / Open-Source Applications
Name Glue Code Estimated Configuration Effort Estimated
Name Glue Code Estimated Configuration Effort Estimated etc.
COTS/ GOTS Comments
STAFFING (Initial and Interim Reports only )
Peak Staff Peak Staff Date
Product and Development Description Comments
Product Size Reporting
Software Requirements Count
Total 23854 Inherited Requirements 13502 4999 Modified Requirements 0
Deleted Requirements 0 Deferred Requirements 1535 Requirements Volatility
Certification and Accreditation Rqts Security Requirements 3250 Safety Requirements 568 Privacy Requirements 0
Software Requirements Comments
No requirements are implemented at this time, as this is the Initial Report and no work has yet been done.
External Interface Requirements Count
Total 1585 Inherited Requirements 950 270 Modified Requirements
Deleted Requirements Deferred Requirements 335 Requirements Volatility
Certification and Accreditation Rqts Security Requirements 30 Safety Requirements Privacy Requirements
External Interface Requirements Comments
The previous Mission Computer will be the baseline for the Upgrade
Mission Computer
Added/New Requirements
Added/New Requirements
167 20130101
This is an Upgrade to the existing Mission Computer, so while there will be new development, it is not a new program
No requirements are implemented at this time, as this is the Initial Report and no work has yet been done.
N/A
34
Development, Format 1: DD Form 3026-1, Initial Report CSCI Level - Faked
Software Data DID Backup
[Form excerpt: SLOC-Based Software Size. Sizing is reported by Primary Language (L1), Secondary Language (L2), and Other, for the prime only and for all subcontractors, in two blocks: ESTIMATES AT COMPLETION (Initial and Interim Reports only) and ACTUAL COUNTS TO DATE (Interim and Final Reports only). Code categories include: amount of delivered code developed new; amount of delivered code carryover (i.e., reused from a previous release), with and without modifications; amount of delivered code reused from an external source (i.e., not carryover from a previous release), with and without modifications; amount of Government-furnished code; amount of deleted code; human-generated versus auto-generated code; and total delivered code. Reused/modified code carries adaptation factors (DM%, CM%, IM%, AAF%). In this faked example the estimate-at-completion totals are 901,589 L1 and 107,025 L2 delivered SLOC, composed of 135,268 L1 / 1,795 L2 new SLOC and 766,321 L1 / 105,230 L2 reused SLOC. A hedged illustration of the AAF computation follows below.]
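The DM%, CM%, IM%, and AAF% fields above are adaptation factors applied to reused or modified code. As a hedged illustration only, the sketch below uses the conventional COCOMO weighting (AAF = 0.4·DM + 0.3·CM + 0.3·IM); the DID and the program's CSDR plan remain the authoritative definition of how AAF% is actually computed and reported, and the example numbers are hypothetical.

```python
def adaptation_adjustment_factor(dm_pct, cm_pct, im_pct):
    """Conventional COCOMO-style AAF (assumed weighting, not taken from the DID):
    DM = % design modified, CM = % code modified, IM = % integration effort."""
    return 0.4 * dm_pct + 0.3 * cm_pct + 0.3 * im_pct

def equivalent_sloc(reused_sloc, dm_pct, cm_pct, im_pct):
    """Scale reused SLOC by AAF to approximate equivalent new-code size."""
    return reused_sloc * adaptation_adjustment_factor(dm_pct, cm_pct, im_pct) / 100.0

# Hypothetical example: 766,321 reused SLOC with 10% design, 20% code,
# and 30% integration change.
print(round(adaptation_adjustment_factor(10, 20, 30), 1))   # 19.0 (%)
print(round(equivalent_sloc(766_321, 10, 20, 30)))          # 145601 equivalent SLOC
```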
35
Maintenance Report, Format 2: DD Form 3026-2, Initial Report Common
Heading - Faked
Software Data DID Backup
PRIME MISSION PRODUCT AH-95 Karuk
PHASE/MILESTONE: REPORTING ORGANIZATION TYPE
Pre-A X B C - FRP X PRIME/ASSOCIATE CONTRACTOR
A C - LRIP O&S DIRECT-REPORTING SUBCONTRACTOR
GOVERNMENT
PERFORMING ORGANIZATION DIVISION
a. NAME: Stark Industries a. NAME: Stark Aircraft
b. ADDRESS: 123 Stark Blvd. b. ADDRESS: 123 Stark Blvd.
Orlando, FL Orlando, FL
APPROVED PLAN NUMBER A-10-C-C25 CUSTOMER N/A
d. NAME: Prototype Development e. TASK ORDER/DELIVERY ORDER/LOT NO.: 01
REPORT TYPE INITIAL X INTERIM FINAL
SUBMISSION NUMBER 2
X RDT&E RESUBMISSION NUMBER 0
PROCUREMENT REPORT AS OF (YYYYMMDD) 20120930
X O&M DATE PREPARED (YYYYMMDD)
DEPARTMENT
REMARKS:
UNCLASSIFIED
SECURITY CLASSIFICATION
SOFTWARE RESOURCES DATA REPORTING: Metadata
MAJOR PROGRAM NAME: AH-95 Karuk
The public reporting burden for this collection of information is estimated to average 16 hours per response, including the time for reviewing instructions, searching existing data sources, gathering and
maintaining the data needed, and completing and reviewing the collection of information. Send comments regarding this burden estimate or any other aspect of this collection of information, including
suggestions for reducing the burden, to Department of Defense, Washington Headquarters Services, Executive Services Directorate, Directives Division, 4800 Mark Center Drive, East Tower, Suite 02G09,
Alexandria, VA 22350-3100 (0704-0188). Respondents should be aware that notwithstanding any other provision of law, no person shall be subject to any penalty for failing to comply with a collection of
information if it does not display a currently valid OMB control number. PLEASE DO NOT RETURN YOUR COMPLETED FORM TO THE ABOVE ORGANIZATION.
OMB Control Number 0704-0188
Expiration Date: 8/31/2016
TYPE ACTION a. CONTRACT NO: WQ5935-12-D-0001 b. MODIFICATION NO.: 1 c. SOLICITATION NO.: N/A
PERIOD OF PERFORMANCE APPROPRIATION
a. START DATE (YYYYMMDD): 20120501
b. END DATE (YYYYMMDD): 20121130
POC NAME (Last, First, Middle Initial) TELEPHONE (Include Area Code) EMAIL ADDRESS
20170930
UNCLASSIFIED
SECURITY CLASSIFICATION
Potts, Pepper Stark Aircraft - Software Division 123-456-7890 ppotts@starkindustries.com
All software development and maintenance effort is under WBS element 1.1.2.1 Custom Software Design. There is no software development or maintenance under the other WBS
elements.
Appropriation is both RDT&E and O&M (same as associated CSDR)
36
Maintenance Report, Format 2: DD Form 3026-2, Initial Report Part 1 - Faked
Software Data DID Backup
System Description
No. of Unique Baselines Maintained 1 - Mission Computer Update 2018 Baseline
No. of Total Hardware Platforms This Software Operates On
X Extensive Regular Event Driven Other
If Other, provide explanation N/A
MAINTENANCE ORGANIZATION
James Rhodes
CERTIFICATION DATE 20110923
LEAD GOVERNMENT ORGANIZATION
LOCATION (Enter the lead government maintenance location)
Software Requirements Count Definitions
External Interface Requirements Count Definitions
The AH-95 Karuk is an Attack Helicopter (AH) with the capability to perform the primary missions of Infantry Support and Close Air
Combat.
This software is an update to the Mission Computer for the helicopter.
4. The Mission Computer (MC) Block C and D hardware.
The Flight Management Computer (FMC) Block B and C hardware.
PRECEDENTS (List at least three similar systems by the same organization or team.)
Flight Armor Software Mark 3
Automobile Flight Software Mark 21
Personal Helicopter Mission Computer
GOVERNMENT
ORGANIZATION NAME
Army Aviation Command
Suite 235
BLDG H-21
Huntsville, AL
The Software Requirements count is defined by counting the number of "Shall" statements in the
latest Software Requirements Specification (SRS). This includes functional, non-functional,
security, safety, privacy and cyber security requirements.
The External Interface Requirements count is defined by counting the number of unique Interface
statements in each Subsystem Interface Description Document (IDD). This includes functional,
non-functional, security, safety, privacy and cyber security requirements.
SOFTWARE MAINTENANCE REPORT, FORMAT 2: Top Level PART 1 Software Maintenance Technical Data
Operation Tempo (check one)
SOFTWARE PROCESS
MATURITY
CMMI
Maturity Level 3
LEAD EVALUATOR EVALUATOR AFFILIATION CMMI Institute
COST ASSESSMENT DATA ENTERPRISE
SURF Process Summary & Initial Findings:
A Deeper Focus on Software Data Quality
This document was generated as a result of the AFCAA-led, Software Resource Data Report Working Group (SRDRWG). This working group represented a joint effort amongst all DoD service cost agencies. The following guidance describes SRDR data
verification and validation best practices as documented by NCCA, NAVAIR 4.2, AFCAA, ODASA-CE, MDA, and many more.
Presented by:
Ranae Woods, AFCAA, ranae.p.woods.civ@mail.mil
Dan Strickland, MDA , daniel.strickland@mda.mil
Nicholas Lanham, NCCA, nicholas.lanham@navy.mil
Marc Russo, NCCA, marc.russo1@navy.mil
Haset Gebre-Mariam, NCCA,
haset.gebremariam@navy.mil
38
Table of Contents
SURF Process Summary & Initial Findings
› Purpose
› SRDR Need Statement
› SURF Purpose
› SURF Created Process Initiation
› SURF Team Structure
› SURF Verification & Validation (V&V) Guide
› SRDR V&V Process
› SRDR Database
› SRDR Data Quality Review
› SURF Status and Metrics
› Summary
39
Presentation Purpose
SURF Process Summary & Initial Findings
› To familiarize the audience with recent Software Resource Data Report
(SRDR) Working Group (WG) efforts to update existing SRDR DID language
and implement data quality improvement
› To clarify how these SRDRWG efforts led to the development of a SRDR
Unified Review Function (SURF) team
› To highlight:
− The SURF mission
− The positive impact of the SURF team and Verification and Validation (V&V) guide on SRDR data quality
40
SURF Need Statement
SURF Process Summary & Initial Findings
Why do these reports need to be reviewed?
› Reduces inaccurate use of historical software data
– Aligns with OSD CAPE initiative(s) to improve data quality
› Helps correct quality concerns prior to final SRDR acceptance
› Allows a central group of software V&V SMEs to tag SRDR data
› SRDR submissions are used by all DoD cost agencies when developing or assessing cost
estimates
› Quality data underpins quality cost and schedule estimates
BBP Principle 2: Data should drive policy. Outside my door a sign is posted that reads, "In God We Trust; All Others
Must Bring Data." The quote is attributed to W. Edwards Deming
- Mr. Frank Kendall, AT&L Magazine Article, January-February 2016
Question: What services helped develop the questions included within the latest SRDR V&V guide?
Answer: All services participating in the SRDR WG provided feedback, comments, and reviews over a year-long SRDRWG effort focused on establishing higher-quality reviews coupled with an ongoing SRDR DID update
41
SURF Purpose
SURF Process Summary & Initial Findings
Purpose
› To supplement the Defense Cost and Resource Center (DCARC) quality review for SRDR submissions
› To develop a consistent, service-wide set of quality questions for all DoD cost community members
to reference
› To provide a consistent, structured list of questions, focus areas, and possible solutions to cost
community members tasked with inspecting SRDR data submissions for completeness, consistency,
quality, and usability (e.g. SRDR V&V Guide)
Why?
› SURF represents an effort to establish a consistent guide for any organization assessing the realism,
quality, and usability of SRDR data submissions
› Quality data underpins quality cost and schedule estimates
Question: How was the SURF team created and is it linked to the SRDRWG?
Answer: Yes. The SRDR Unified Review Function (SURF) team was organized as part of the larger SRDRWG initiative during 2015
42
How Was SURF Created?
SURF Process Summary & Initial Findings
1. Revised SRDR Development Data Item
Description (DID)
2. New SRDR Maintenance Data Item
Description (DID)
3. Joint Validation & Verification (V&V)
Guide, Team, and Process
4. Software Database Initial Design and
Implementation Process
1. Reduces inconsistency, lack of visibility,
complexity, and subjectivity in reporting
2. Aligned w/ dev. but w/ unique data/metrics
available/desired for maintenance phase
3. Higher quality, less duplication - ONE central
vs many distributed; 1 joint team & guide
gives early, consistent feedback to ktrs
4. Avoids duplication, variations - ONE central
vs many distributed; Based on surveyed
best practices and user expectations
Recommendation Benefit
Question: How do members get involved with SURF? Why are there “primary” and “secondary” members?
Answer 1: The SURF team was established by Government SRDRWG members who were recommended/volunteered by each DoD service
Answer 2: Primary members are included on CSDR S-R IPT email notifications for their specific service. Secondary members are contacted during
periods of increased review demands, if necessary.
SURF Secondary, SURF Primary, and DCARC Analyst & STC members by organization:
DoD: William Raines
Navy: Corrinne Wallshein, Wilson Rosa, Stephen Palmer, Philip Draheim, Sarah Lloyd
Marine Corps: Noel Bishop, John Bryant
Air Force: Ron Cipressi, Janet Wentworth, Chinson Yew, Eric Sommer
Army: Jim Judy, Jenna Meyers, James Doswell, Michael Smith, Michael Duarte
SPAWAR: Jeremiah Hayden, Min-Jung Gantt
MDA: Dan Strickland
43
SURF Team Structure
SURF Process Summary & Initial Findings
› Team is comprised of one primary member per service along with support from secondary team members
(Government Only)
› As submissions are received, SRDR review efforts will be distributed amongst SURF team members to
balance workload
› SURF Team Coordinators (STC): Marc Russo & Haset Gebre-Mariam
› Current SURF structure:
SURF Team Coordinators (STC)
Marc Russo
Haset Gebre-Mariam
SURF Advisor & Process Owner
(SAPO)
Nick Lanham
SRDR Submission received from
DCARC
Question: Did a standardized-joint service, software-specific quality review guide exist prior to the SURF V&V guide? Who contributed to the
development of this document?
Answer 1: No. Services implemented very inconsistent SRDR review methodologies (if conducted at all) prior to DCARC acceptance
Answer 2: The SRDR V&V guide was developed by the SURF team and has been reviewed by numerous SRDRWG, OSD CAPE, and other cost
community team members. Feedback from other services has generated significant improvements from initial draft.
44
SRDR V&V Guide
SURF Process Summary & Initial Findings
Guide represents first-ever, joint effort amongst DoD
service cost agencies
› OSD public release approved 5 April 2016
› Kickoff email distributed on 1 May 2017 to
update guide with latest DID requirements
› Files can be downloaded using following link:
http://cade.osd.mil/roles/reviewers#surf
Enables ability to consistently isolate software cost
relationships and trends based on quality SRDR data
› Now includes quick-reference MS excel question
checklist by SRDR DID section
Two main purposes:
› SRDR V&V training guide (V&V questions)
› Focus areas used to determine SRDR quality tags
V&V Questions and Examples Developed and Organized by Individual SRDR Reporting Variable
45
SRDR V&V Guide Table of
Contents (TOC)
SURF Process Summary & Initial Findings
1.0 Review of an SRDR submitted to DCARC
1.1 Reporting Event
1.2 Demographic Information
1.3 Software Char. and Dev. Process
1.3.1 Super Domain and Application Domains
1.3.2 Operating Environment (OE) Designation
1.3.3 Development Process
1.4 Personnel
1.5 Sizing and Language
1.5.1 Requirements
1.5.2 Source Lines of Code (SLOC)
1.5.3 Non-SLOC Based Software Sizing
1.5.4 Product Quality Reporting
1.6 Effort
1.7 Schedule
1.8 Estimate at Completion (EAC) Values
2.0 Quality Tagging
3.0 Solutions for Common Findings
3.1 Allocation
3.2 Combining
3.3 Early Acquisition Phase Combining
4.0 Pairing Data
5.0 Possible Automation
Appendix A – SD and AD Categories
Appendix B – Productivity Quality Tags
Appendix C – Schedule Quality Tags
Appendix D – SRDR Scorecard Process
V&V Guide Includes Specific Questions For SURF Members to Confirm Prior to Accepting the Report
46
SRDR V&V Guide Example Questions
SURF Process Summary & Initial Findings
Section 1.6: Effort
When assessing Effort, the V&V priority is determining completeness
Determining completeness is not always easy due to:
– The contractor possibly collecting/reporting their actual performance using categories that differ from the IEEE 12207 standard
– The contractor reporting all of their Effort within the Other category
Common questions to ask when looking at the effort are:
– Was effort data reported for each CSCI or WBS?
– Was effort data reported as estimated or actual results? If the submission includes estimated values and actual results, does the report include a clear and documented split between actual results and estimated values?
– Is the effort data reported in hours?
– Is effort data broken out by activity?
– What activities are covered in the effort data? Is there an explanation of missing activities included within the supporting SRDR data dictionary? …
47
SURF Team V&V Process
SURF Process Summary & Initial Findings
Monthly Recurring SURF and DCARC Communications:
• DCARC Step 1 (1st week of every month): SRDR status list sent to SURF Team Coordinator
• SURF Step 1 (+2 days): SRDR status list distributed to Primary and Secondary POCs
• SURF Step 2 (+13 days): Conduct V&V reviews by populating MS Excel question template
• SURF Step 3 (NLT +14 days): Provide completed V&V question templates back to DCARC
• DCARC Step 2 (timing varies by contractor): Combine SURF and DCARC comments; coordinate comment resolution with submitting organization
• Database Step 1 (timing varies by number of submissions): Adjudicated SRDR sent to NAVAIR 4.2 for data entry into DACIMs dataset (Note: future database(s) will be hosted via CADE)
Purpose of SURF Process: To provide completed V&V checklists to DCARC within 2 weeks of request
Important Note: CADE is developing relational databases for new DID formats. Over time, data entry will be automated. Until that time, manual data entry continues by the NAVAIR 4.2 team for the development format only. Please refer to the V&V guide for additional automation details and future data quality initiatives
48
SRDR Database Location
SURF Process Summary & Initial Findings
› Login to CADE and click on
“CADE Portal Login”
− http://cade.osd.mil/
› Select “My CADE” from the
menu, and click on “CADE
Library”
› Use the Advanced Search and filter on "Container Types" under "Software Data"
Question: Where does SRDR data go after SURF Review?
Answer: Once SRDR record has been accepted, Data is entered into SRDR dataset posted to CADE>DACIMs web portal
Question: Who enters the data into the dataset?
Answer: Currently members from NAVAIR 4.2 enter data to SRDR dataset (10+ years of experience). Future data entry is
planned to be automated using .XML schemas linked to latest DID formats
› SRDR database is available to Government analysts with access to the CADE portal
– This dataset is the authoritative source for SRDR data (10+ years of uploads)
› Data is not automatically considered “Good” for analysis
› SURF team may recommend DCARC not accept a submission due to several data quality concerns
outlined in the V&V guide. Examples include:
– Roll-up of lower-level data (to avoid double counting)
– Significant missing content in hours, productivity, and/or SLOC data
– Interim build actuals that are not stand-alone
– Inconsistencies or oddities in the submission
– Additional reasons discussed in the V&V guide
49
SRDR Data Quality Review
SURF Process Summary & Initial Findings
Data Segments Dec-07 Dec-08 Oct-10 Oct-11 Aug-13 Apr-14 Apr-15 Dec-16
CSCI Records 688 964 1473 1890 2546 2624 2853 3487
Completed program or build 88 191 412 545 790 911 1074 1326
Actuals considered for analysis (e.g., “Good”) 0 119 206 279 400 403 682 829
Paired Initial and Final Records 0 0 78 142 212 212 212 240
Dataset Posted to OSD CAPE DACIMS Web Portal
SURF Team Combined With V&V Guide and DCARC
Have Significantly Improved Software Data Quality
› Prior to SURF process, only 15% of SRDR data was considered “Good”
› After one+ year of SURF reviews, ~24% of data has been tagged as “Good”
› Army team currently working to review historical data. Once completed, “Good” percentage
will likely increase to ~31%
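The "15%" and "~24%" figures quoted above can be reproduced directly from the counts in the table; this assumes the pre-SURF figure corresponds to the Apr-14 snapshot. A minimal Python check:

```python
# Reproduce the "Good" data percentages quoted above from the table's counts
snapshots = {            # date: (total CSCI records, records tagged "Good")
    "Apr-14": (2624, 403),
    "Apr-15": (2853, 682),
    "Dec-16": (3487, 829),
}
for date, (total, good) in snapshots.items():
    print(f"{date}: {good / total:.1%} of records tagged 'Good'")
# Apr-14: 15.4%   Apr-15: 23.9%   Dec-16: 23.8%  (i.e., ~15% pre-SURF vs ~24% after)
```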
50
SRDR Data Quality Review
SURF Process Summary & Initial Findings
[Chart, 2011-2017 Trend Analysis: Total Records (e.g., CSCIs) vs. Actuals considered for analysis (e.g., "Good") — Oct-11: 1890 / 279; Aug-13: 2546 / 400; Apr-14: 2624 / 403; Apr-15: 2853 / 682; Dec-16: 3487 / 829.]
51
SURF Team Status
SURF Process Summary & Initial Findings
› Recurring SURF team meetings kicked off on 23 June 2015
– Group includes ~19 Government team members from across the DoD
– Has received very positive feedback from the DoD cost estimation community, DCARC analyst(s), and even program office communities since inception
– Completed initial version of SRDR V&V guide March 2015
– Initiated SURF Initial team training using draft V&V guide June 2015
– Completed development of SURF team charter July 2015
– During training period, SURF generated 483 V&V comments provided to DCARC (June 2015 to March 2016)
– Completed official SURF kickoff with DCARC and published V&V guide March 2016
– After training period, formal SURF process generated 889 V&V comments (March 2016 to December 2016)
– Concluding CY16, SURF team generated 1,372 V&V comments from 92 SRDR submissions (1,282 during CY16)
› Current Status
– CY17 represents first full year of official SURF reviews using published V&V guide
– Team recently kicked off effort to update existing V&V questions to align with the latest SRDR DID
– Co-chaired Collaborative Cost Research Group (CCRG) focused on increasing “Good” SRDR records (March 2017)
– Continued process improvement efforts to maintain efficient and effective process
– Working with DCARC to develop SURF User Interface within CADE
V&V Comments Have Significantly Improved SRDR Data Quality
Key Facts As of June 2017
52
SURF Comments by V&V Category
SURF Process Summary & Initial Findings
All Reviews (Mar – Dec 2016)
53
SURF Team Findings by V&V Category
SURF Process Summary & Initial Findings
› V&V comments are generated when SURF members answer a question with “No”
› Common trends for “No” responses:
– Reports not including all types of requirement counts (e.g., new, modified, inherited, deleted, cybersecurity, etc.)
– Reports not including COTS/GOTS “glue code” Software Lines of Code (SLOC) totals
– Reports not including SLOC counts using USC Unified Code Count (UCC) tool
– Reports not including software defect counts
– Reports not including a subset of required metadata (e.g., TD, EMD, Prototype, Service, Milestone, Contract Type, etc.)
› Common trends for “Yes” responses:
– Reports include a subset of metadata (e.g., contractor, contract number, report type, report POC, program name, etc.)
– Reports typically have SLOC counts broken out by new, modified, reuse, auto, and deleted
– Reports typically include primary language type designation
– Reports typically include “Effort” broken out by activity
› Common trends for “N/A” responses:
– Reports typically do not include “Forecast At Completion (FAC)” values
– Reports typically do not include non-SLOC sizing metrics (Function Points, RICE-FW, etc.)
– SURF analyst typically does not have access to corresponding CSDR plan (Working with CADE to develop SURF portal)
All Reviews (Mar – Dec 2016)
54
SURF V&V Guide
SURF Process Summary & Initial Findings
Currently, SURF members are updating or creating draft question lists to account for new DIDs for
Development, Maintenance, and ERP
› Updates to the development question lists include improvements to the list from lessons
learned over the previous year
Draft question lists will then be sent out to a larger group of SRDR-focused team members to ensure the question lists are reasonable and that they address data quality concerns
› Important to keep question lists to a reasonable size for continued SURF success
V&V guide and question templates to be updated to incorporate new questions as well as other
lessons learned
Send the updated V&V guide to the larger SRDR working group and senior management for final comments/feedback
Send Updated V&V guide to OSD for final PAO approval and posting to CADE
Process for Update
55
SURF Summary
SURF Process Summary & Initial Findings
SURF is focused on improving data quality and helping support robust Government
review process
We would like to thank all of the DoD and Non-DoD individuals who have commented,
participated, and provided feedback throughout the past few years
Please feel free to use the contact information below if you would like more
information regarding SURF, the SRDR V&V Guide, or checklist
Marc Russo
Naval Center for Cost Analysis (NCCA)
NIPR: Marc.russo1@navy.mil
Ron Cipressi
Air Force Cost Analysis Agency
NIPR: Ronald.p.cipressi.civ@mail.mil
Nicholas Lanham
Naval Center for Cost Analysis (NCCA)
NIPR: Nicholas.lanham@navy.mil
Dan Strickland
Missile Defense Agency (MDA)
NIPR: Daniel.strickland@mda.mil
COST ASSESSMENT DATA ENTERPRISE
SURF Process Summary
Backup
56
Question ID Question from V&V Guide Template N(No) N(Yes) N(N/A) N(No Resp.)
1.5.1.1 Does the submission clearly illustrate the number of Inherited, Added, Modified, Deleted, and Deferred requirements for both internal and external categories? 31 0 1 0
1.5.2.16
If COTS or GOTS items have been included within the submission, has the submitting organization provided the SLOC total required to integrate the identified COTS/GOTS product
(i.e. Glue code)?
28 0 4 0
1.5.1.2 Has the submitting organization separated the provided requirements by Security, Safety, and Privacy or Cybersecurity? 26 1 5 0
1.5.2.4
Did the submitter use the Aerospace-approved version of the University of Southern California (USC) Center for Systems and Software Engineering (CSSE) Unified Code Count (UCC)
tool to count the provided SLOC totals? If not, was the name of the code counting tool used by the submitting organization included within the supporting comments section
and/or data dictionary?
25 6 1 0
1.2.5 Has the system description been included within the submission? 24 8 0 0
1.2.2 Has the Major Defense Acquisition Program (MDAP) or Major Automated Information System (MAIS) designation been listed? 20 2 10 0
1.3.2.3 Has the state of development been identified (For example: Prototype, Production Ready, or a mix of the two)? 19 11 2 0
1.5.4.2 Has the priority level for each category of software defects been provided? 18 9 5 0
1.1.9
Is it clear if the information represents a Technology Demonstration (TD) or Engineering and Manufacturing Development (EMD) phase if the program is in that stage of
development?
17 8 7 0
1.5.2.13 Has the contractor or submitting organization provided the name of the software products that have been referenced to generate the provided reuse SLOC totals? 17 14 1 0
1.5.4.1 Has the submitting organization provided a breakout of the number of software defects Discovered, Removed, and Deferred? 17 10 5 0
1.7.2 Has the submitting organization clearly stated if the provided schedule data was reported as estimated, allocated, or actual results? 16 16 0 0
1.2.16 Is the specific U.S Military service branch or customer identified (For example: Navy, Air Force, Army, prime contractor, etc.)? 15 14 3 0
1.3.1.1 Does the SRDR submission, comments section, or data dictionary include a clear system level functional description and software operational overview? 15 17 0 0
1.2.6 Have the program phase and/or milestone been included within the report (for example: Pre-A, A, B, C-LRIP, C-FRP, O&S, etc.)? 14 18 0 0
1.3.2.1 Does the SRDR data dictionary include a clear system-level functional description and software operational overview? 14 17 1 0
1.2.19 Has the contract Period of Performance (PoP) been identified? 13 19 0 0
1.2.23
Does the submission include adequate detail within the comments section to support analysts who may reference the submission sometime in the future (For example: Provide
context for analyzing the provided data, such as any unusual circumstances that may have caused the data to diverge from historical norms)?
13 18 1 0
1.3.3.3 If an upgrade, does the SW sizing reflect significant reuse or modification SLOC totals when compared to New code? 13 16 2 1
1.4.5
Does the peak headcount make sense against the reported schedule and hours? A simple test is to divide the total reported hours by the schedule months and then convert the
resulting average monthly hours into an average Full Time Equivalent (FTE) count using the reported hours in a man-month. The peak headcount must be higher than this FTE
monthly average. At the same time the peak headcount should not be wildly disproportional to that average either. (See the sketch after this table.)
13 17 1 1
1.5.2.10 Were code adaptation factors reported (percent redesign, recode, reintegration)? Do they appear to be unique for each CSCI, or are they standard rules of thumb? 13 4 15 0
1.2.17
Has the specific contract type been identified? For contracts, task orders, or delivery orders with multiple CLINs of varying contract types, the Contract Type reporting should be
the one associated with the plurality of cost.
12 20 0 0
1.2.22
Has the funding appropriation been identified (for example: Research, Development, Test and Evaluation (RDT&E), Procurement, Operation and Maintenance (O&M), Foreign
Military Sales (FMS), etc.)?
12 20 0 0
1.2.4 Has the Defense material item category been provided in accordance with MIL-STD-881C guidance (for example: Aircraft, radar, ship, Unmanned Aerial Vehicle (UAV) system)? 12 12 8 0
1.3.3.5 Has the development method also been identified (for example: Structured Analysis, Object Oriented, Vienna Development, etc.)? 12 20 0 0
1.6.2
Was effort data reported as estimated or actual results? If the submission includes estimated values and actual results, does the report include a clear and documented split
between actual results and estimated values?
12 18 2 0
V&V Questions Most Frequently With “No”
Response
SURF Process Summary & Initial Findings
All Reviews (Mar – Dec 2016)
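Question 1.4.5 above describes an arithmetic cross-check of peak headcount against total hours, schedule, and hours per man-month. A minimal Python sketch of that check follows; the "wildly disproportional" bound is an illustrative assumption, since the guide does not fix a number, and the example values are hypothetical.

```python
def peak_headcount_check(total_hours, schedule_months, hours_per_man_month,
                         peak_staff, max_ratio=5.0):
    """Sanity-check peak staff against the average FTE level implied by the
    reported effort and schedule (V&V guide question 1.4.5). max_ratio is an
    illustrative bound on 'wildly disproportional'."""
    avg_monthly_hours = total_hours / schedule_months
    avg_fte = avg_monthly_hours / hours_per_man_month
    plausible = avg_fte <= peak_staff <= max_ratio * avg_fte
    return avg_fte, plausible

# Hypothetical report values (not taken from the example forms in this deck):
# 120,000 total hours over 24 months at 152 hours per staff-month, peak staff 60.
avg_fte, ok = peak_headcount_check(120_000, 24, 152, 60)
print(f"average FTE ≈ {avg_fte:.1f}; peak staff of 60 plausible: {ok}")
# -> average FTE ≈ 32.9; peak staff of 60 plausible: True
```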
57
Question ID Question from V&V Guide Template N(No) N(Yes) N(N/A) N(No Resp.)
1.2.12 Is the contract number reported? 0 32 0 0
1.2.20 Has the report type been identified (for example: Initial, Interim, or Final)? 0 32 0 0
1.2.7 Has the contractor or organization that performed the work been identified? 0 32 0 0
1.2.21 Is there a single submission Point of Contact (POC) and supporting contact information included within the report? 3 29 0 0
1.7.1 Has schedule data been included in the submission? 3 29 0 0
1.7.4 Is schedule data broken out by SRDR activity? 3 29 0 0
1.1.2 Does the report reference the CSDR Plan? 4 28 0 0
1.2.1 Has the program name been identified? 4 28 0 0
1.5.2.1 Was the primary programming language reported? 4 28 0 0
1.5.2.6 Are the SLOC counts for different types of code (e.g., new, modified, reused, auto-generated, Government-furnished, and deleted) separated or are they mixed together? 4 28 0 0
1.6.3 Is the effort data reported in hours? 4 28 0 0
1.6.4 Is effort data broken out by activity? 4 28 0 0
1.6.5
Was the specific ISO 12207:2008 activities that are covered in the effort data (For example: Requirements analysis, architectural design, detailed design, construction, integration,
qualification testing, and support processes) clearly discernable?
4 28 0 0
1.1.6 Is there an easily identifiable event associated with the submission (for example: Contract Award, Build 2 Release, Build 1 Complete, Contract Complete, etc.)? 5 27 0 0
1.6.1 Was effort data reported for each CSCI or WBS? 5 27 0 0
1.2.14
Is the software process maturity and quality reporting definition provided (For example: Capability Maturity Model (CMM), Capability Maturity Model Integration (CMMI), or other
alternative rating)?
4 27 1 0
1.7.3 Has schedule data been reported in number of months from contract start or as calendar dates? 3 27 2 0
1.2.13 Are precedents reported and consistent from submission to submission? 1 27 4 0
1.3.3.4 What precedents or prior builds are identified to give credibility to the upgrade designation? 1 27 3 1
1.3.3.2 Has the contractor indicated whether the software is an upgrade or new development? If not, why not? 6 26 0 0
1.4.3 Does the data dictionary define what the skill level requirements are, and is the contractor adhering to that definition? 6 26 0 0
1.2.15 Is the Process Maturity rating reported with an associated date, and has it changed from a prior submission? 3 26 3 0
1.2.10 Has the contractor or submitting organization illustrated whether they were the primary or secondary developer? 7 24 1 0
1.6.6
Were common WBS elements/labor categories such as System Engineering (SE), Program Management (PM), Configuration Management (CM), or Quality Management (QM)
been broken out separately?
7 24 1 0
1.7.5
Does the report include unique schedule start and end date values? For example, do multiple records have the same schedule data, e.g., same calendar dates for multiple
WBS/CSCIs or builds?
7 24 1 0
1.2.3 Has the Prime Mission Product (PMP) name been clearly identified (for example: most current official military designation)? 5 24 3 0
V&V Questions Most Frequently With “Yes”
Response
SURF Process Summary & Initial Findings
All Reviews (Mar – Dec 2016)
58
Question ID Question from V&V Guide Template N(No) N(Yes) N(N/A) N(No Resp.)
1.5.3.2
If function points have been provided has the submitting organization clearly illustrated the function point count type (For example: Enhancement Project, Application, or
Development Project)?
0 0 32 0
1.5.3.3
Has the submitting organization provided the number of Data Functions and Transactional Functions (For example: Internal Logic Files, External Interface File, External Inquiries,
External Inputs, and External Outputs)?
0 0 32 0
1.5.3.4 Has the submitting organization included the Value Adjustment Factor? 0 0 32 0
1.5.3.5
If the submitting organization has provided sizing metrics using the Reports, Interfaces, Conversions, Extensions, Forms, and Workflows (RICE-FW) convention, has the complexity
of each RICE-FW category been provided?
0 0 32 0
1.8.1 FACH: Has a description been provided that describes which ISO 12207:2008 elements have been included within the provided total? 1 0 31 0
1.8.2 FACH: Do sub-element FAC values sum to the parent FAC total value? 1 0 31 0
1.8.3 If the report is a final report, does the provided ATD total match the provided FAC total? 1 0 31 0
1.1.1 Is the submission compliant with the CSDR Plan, i.e., a comparison of the submission to the plan requirement? 1 1 30 0
1.5.3.1
Were SLOC counts reported, or were other counting or sizing metrics used (e.g. function points, use cases, rung logic ladders, etc.)? If so, has the submitting organization obtained
the appropriate authorization to report non-SLOC based sizing within the corresponding CSDR plan?
0 3 29 0
1.6.17 If subcontractor hours have not been provided, did the reporting organization provide subcontractor dollars? 2 1 29 0
1.5.2.17
If COTS or GOTS integration or glue code has been included within the submission, does the total seem realistic when compared to the total SLOC included in the CSCI or WBS
element (For example: COTS integration code equals 500 KSLOC and the total SLOC for the specific CSCI or WBS element equals 150 KSLOC)? Note: this scenario sometimes occurs when the submitting organization counts the total SLOC of the specified COTS or GOTS product vice the integration or glue code required to integrate the product.
3 0 28 1
1.6.9
Do the children or lower-level WBS/CSCI elements add up to the parent? If not, is there effort that is only captured at a higher-level WBS/CSCI level that should be allocated to the
lower-level WBS/CSCI elements? (See the sketch after this table.)
5 3 24 0
1.2.18 Has the total contract price been identified? 9 0 23 0
1.5.1.3
Do the number of requirements trace from the parent to the children in the WBS? If not, this could imply that some portion of the software effort is only captured at higher-level
WBS/ CSCI elements and should be cross checked.
3 4 23 2
1.5.2.9
For a Final report does the size look realistic? For example: is all of the code rounded to the nearest 1000 lines, or does the dictionary indicate that they had difficulty counting
code that may have come from a subcontractor?
1 9 22 0
1.1.7
If there are prior submissions, is this submission an update to a prior submission or a new event? If the submission is an update to an existing submission, does the latest
submission clearly describe what report the prior submission is linked to?
2 8 21 1
1.2.11 If effort was outsourced, has the outsourced organization been provided? 4 7 21 0
1.6.7 Is there an explanation of missing activities included within the supporting SRDR data dictionary? 7 4 21 0
1.1.8 If a prior submission exists, is the information that has changed readily identifiable and a reason for the change provided (either in the data dictionary or comments section)? 5 6 20 1
1.4.4 Does the skill mix make sense relative to the complexity of the code (unusual amount of very low or very high skill mix, for example)? 0 12 20 0
1.5.4.3
If the report is an interim or final submission, has the number of Discovered, Removed, and Deferred defects changed from the previous submission? If significant changes have
occurred, does the supporting comments section and/or data dictionary provide details regarding what drove the significant change in product quality metrics?
10 2 20 0
1.6.12
Does the submission include unique values for each of the lower-level CSCI or WBS elements? For example, do multiple related records have the same effort data (i.e. activity
effort is repeated or total effort is repeated)?
6 6 20 0
1.4.2 If there was a prior submission, has the skill mix changed dramatically and, if so, is there an explanation why? Conversely, did it remain unchanged? If so, why? 8 4 19 1
1.5.2.14 When subcontractor code is present, is it segregated from the prime contractor effort, and does it meet the same criteria for quality as the prime’s code count? 6 7 19 0
V&V Questions Most Frequently With “N/A”
Response
SURF Process Summary & Initial Findings
All Reviews (Mar – Dec 2016)
59
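Several of the checks above, such as 1.6.9 (children roll up to the parent) and 1.8.3 (ATD equals FAC on a final report), are simple arithmetic consistency tests and lend themselves to automation. The sketch below is illustrative only, not part of the official SURF tooling; the function names and the flat hours/totals inputs are assumptions made for demonstration rather than the SRDR schema.

```python
# Illustrative sketch of two SRDR V&V consistency checks (questions 1.6.9 and 1.8.3).
# Field layout and names are assumed for demonstration; they are not the official schema.

def children_sum_to_parent(parent_hours, child_hours, tolerance=0.01):
    """1.6.9: do the lower-level WBS/CSCI effort hours roll up to the parent value?"""
    total = sum(child_hours)
    return abs(parent_hours - total) <= tolerance * max(abs(parent_hours), 1.0)

def atd_matches_fac(report_type, atd_total, fac_total):
    """1.8.3: for a final report, the Actuals-To-Date total should equal the FAC total."""
    if report_type.strip().lower() != "final":
        return True  # the check is N/A for initial and interim reports
    return atd_total == fac_total

if __name__ == "__main__":
    # Hypothetical values for illustration only.
    print(children_sum_to_parent(1000.0, [400.0, 350.0, 250.0]))          # True: children roll up cleanly
    print(atd_matches_fac("Final", atd_total=135000, fac_total=134500))   # False: would generate a SURF comment
```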
SRDR Software Reporting DID Training - Nov. 2017.pptx

  • 1. COST ASSESSMENT DATA ENTERPRISE Software Resource Data Reporting (SRDR)
  • 2. 2 Overview Software Resource Data Reporting (SRDR) Resources Data Item Description (DID): DI-MGMT-82035 Development, Format 1: DD Form 3026-1 Maintenance, Format 2: DD Form 3026-2 Enterprise Resource Planning (ERP): DD Form 3026-3 Intent of SRDR is to collect objective and measurable data commonly used by industry and DoD cost analysts Builds a repository of estimated and actual: › SW product sizes › Schedules › Effort › Quality
  • 3. 3 New Formats Software Resource Data Reporting (SRDR) Technical Data SW size, context, technical information Release level and computer SW configuration item (CSCI) level sections Effort Data Reports SW efforts associated with each reported release and CSCI Technical Data SW size, context, technical information Top level and release level sections Effort Data Reports the to-date SW maintenance efforts for each in- progress and completed release(s), and total maintenance activities Technical Data Provides context, SW product, Development Team, etc. Effort Data Project resource and schedule information at the release level Part 1 Part 2 Development Maintenance ERP
  • 4. 4 Updates Software Resource Data Reporting (SRDR) No Additional Significant Updates to Format 1 and 2 Format 1 Updates Introduction of Agile Measures › Supports capturing the LOE with this SW dev methodology Instructions captured within DID Format 2 Updates Clarified SW change count definitions › Confusion in original DID regarding inclusion of Information Assurance Vulnerability Alerts (IAVAs) › Updated instructions to ensure IAVA changes are mutually exclusive from Priority 1-5 changes, and are not double counted Moved SW License Management Hours reporting to Project Management (PM) › SW License Management is a PM activity Development Maintenance
  • 5. 5 Submission Event Process Flow Diagram Software Resource Data Reporting (SRDR) Event 1 Initial Contract Initial Event 2 Interim Event 3 Interim Event 4 Interim Complete Event 5 Interim Release 1 Release Initial Release 2 Contract . Contract Interim Release Interim Release 3 Release Final Contract Final No Change from previous report Release 1 Release 2 Release 2 Release 3 Release 4 This reflects what is contained in each individual submission This reflects the individual submissions Estimate In-Process/Actuals Final/Actuals Final/Actuals Final/Actuals Estimate Estimate Estimate Estimates Estimate In-Process/Actuals In-Process/Actuals In-Process/Actuals In-Process/Actuals In-Process/Actuals In-Process/Actuals In-Process/Actuals In-Process/Actuals In-Process/Actuals In-Process/Actuals Release 4 All Release Reports Due
  • 6. COST ASSESSMENT DATA ENTERPRISE Software Resource Data Reporting (SRDR) Presented by: Paul Kopicki, paul.a.kopicki.civ@mail.mil
  • 7. 7 AGENDA Software Resource Data Report (SRDR) › Software DID Overview (Development/Maintenance/ERP) › Software Data DID CRM Review › Software Data DID Path Forward
  • 9. 9 Software DID Overview Software DID Overview (Development/Maintenance/ERP) › Software Resources Data Reporting: • Data Item Description: DI-MGMT-82035 • Development, Format 1: DD Form 3026-1 • Maintenance Report, Format 2: DD Form 3026-2 • Enterprise Resource Planning (ERP), Format 3: DD Form 3026-3
  • 10. 10 Data Item Description (DID): DI-MGMT-82035 Software DID Overview (Development/Maintenance/ERP) › DID summarizes the Software (SW) Development Report, Maintenance Report, and ERP SW Development Report › Provides instructions to support data and Frequency requirements specified in the CSDR reporting › Intent of SRDR is to collect objective, measurable data commonly used by industry and DoD cost analysts › Builds a repository of estimated and actual: • SW product sizes • Schedules • Effort • Quality
  • 11. 11 Development, Format 1: DD Form 3026-1 Software DID Overview (Development/Maintenance/ERP) Consists of two parts Part 1 – SW Development Technical Data › SW size, context, technical information › Release Level and Computer SW Configuration Item (CSCI) Level sections Part 2 – SW Development Effort Data › Reports SW efforts associated with each reported release and CSCI
  • 12. Development, Format 1: DD Form 3026-1 Software DID Overview (Development/Maintenance/ERP) 12 Format 1 updates: › Introduction of Agile Measures › Instructions captured within DID ex: AAAA 1.x.x 5 3 15 10 200 180 Release Map SECTION 3.3.2.6.5 Planned and Achieved Development (by Feature per Epic) SECTION 3.3.2.6.6 Epic/ Capability Identifier Feature Identifier Planned Stories per Feature by Epic Actual Stories Planned Story Points per Feature by Epic Actual Story Points Planned Hours per Feature by Epic Actual Feature Hours (NOTE: Insert rows as needed to account for all Features and Epics mapped to this Release.) Summary Totals for This Release SECTION 3.3.2.6.7 (Sum of all rows with data by Feature by Epic) Quantity Actually Developed Quantity Planned Item Agile Measures Based: SECTION 3.3.2.6.4 Days per Sprint: Days per Release: Total Features Total Epics/ Capabilities Total Stories Total Story Points Total Feature Hours Total Sprints No Additional Significant Updates to Format 1
  • 13. 13 Maintenance Report, Format 2: DD Form 3026-2 Software DID Overview (Development/Maintenance/ERP) Consists of two parts Part 1 – SW Maintenance Technical Data › SW size, context, technical information › Top Level and Release Level sections Part 2 – SW Maintenance Effort Data › Reports the to-date SW maintenance efforts for each in-progress and completed release(s), and total maintenance activities
  • 14. 14 Maintenance Report, Format 2: DD Form 3026-2 Software DID Overview (Development/Maintenance/ERP) Format 2 updates › Clarified SW change count definitions • Confusion in original DID regarding inclusion of Information Assurance Vulnerability Alerts (IAVAs) • Updated instructions to ensure IAVA changes are mutually exclusive from Priority 1-5 changes, and are not double counted › Moved SW License Management Hours reporting to Project Management (PM) • SW License Management is a PM activity No Additional Significant Updates to Format 1
  • 15. 15 Enterprise Resource Planning (ERP), Format 3: DD Form 3026-3 Software DID Overview (Development/Maintenance/ERP) SW Development Report specifically for ERP or Defense Business System programs › Financial › Personnel › Inventory Consists of two parts Part 1 – System Technical Data › Provides context, SW product, Development Team, etc. Part 2 – SW Development Effort Data › Project resource and schedule information at th Release Level
  • 16. 16 Enterprise Resource Planning (ERP), Format 3: DD Form 3026-3 Software DID Overview (Development/Maintenance/ERP) Part 1 – Technical Data A. Report Context 3.3.1 Description of Actual Development Organization B. Product and Team Description 3.3.2 Microcode and Firmware Communication Software Tools Signal Processing System Software Mission Planning Vehicle Payload Process Control Custom AIS Software Vehicle Control Scientific and Simulation Enterprise Service System Other Real-Time Embedded Test, Measurement, and Diagnostic Equipment Enterprise Information System Command and Control Training 10. Primary Application Domain comments 3.3.2.3 11. Comments on Subsection B responses: 3.3.2.4 3. Certification Date: 3.3.1.4 4. Lead Evaluator of CMM Certification: 3.3.1.5 5. Affiliation to Development Organization: 3.3.1.6 ENTERPRISE RESOURCE PLANNING (ERP) SOFTWARE RESOURCES DATA REPORTING, FORMAT 3: PART 1 Software Development Technical Data (Project) SECTION 3.3 1. Release Name: 3.3.1.1 Release ID: 3.3.1.2 2. Certified CMM Level (or equivalent): 3.3.1.3 8. Project Functional Description: 3.3.2.1 9. Primary Application Domains for ERP Programs 3.3.2.2 6. Precedents (list up to ten similar systems completed by the same organization or team): 3.3.1.7 7. Comments on Subsection A responses: 3.3.1.8
  • 17. 17 Enterprise Resource Planning (ERP), Format 3: DD Form 3026-3 Software DID Overview (Development/Maintenance/ERP) Part 1 – Technical Data
  • 18. 18 Enterprise Resource Planning (ERP), Format 3: DD Form 3026-3 Software DID Overview (Development/Maintenance/ERP) Part 2 – SW Development Effort Data
  • 19. 19 Enterprise Resource Planning (ERP), Format 3: DD Form 3026-3 Software DID Overview (Development/Maintenance/ERP) Part 2 – SW Development Effort Data
  • 20. 20 Enterprise Resource Planning (ERP), Format 3: DD Form 3026-3 Software DID Overview (Development/Maintenance/ERP) Part 2 – SW Development Effort Data
  • 22. SRDR DID & Forms CRM Review Software Data DID CRM Review 22 Document # of Comments Taking Action No Action General SRDR DID/Form 20 7 13 Maintenance Form 11 7 4 ERP Form 6 2 4 37 16 21 7 8 2 4 2 10 4 0 2 4 6 8 10 12 14 16 18 20 DASA-CE LM SSC-SEIT MDA/CP NAVAIR Taking Action No Action 11 2 18 6 37 Comments Several rounds of comments through Gov. and Industry Last round of DID and Forms sent in April 17; Reviewed/Adjudicated comments with SRDR WG through early May › Open comments coordinated with developers and closed out with POCs end of May
  • 23. SRDR DID & Forms CRM Review Software Data DID CRM Review 23 7 8 2 4 2 10 4 0 2 4 6 8 10 12 14 16 18 20 DASA-CE LM SSC-SEIT MDA/CP NAVAIR Taking Action No Action 11 2 18 6 37 Comments Major DID Changes since May 2017 › Agile metrics reporting • From: All agile programs instructed to use ERP instructions and submit agile data using Format 3 • To: Format 1 was updated to include Agile table too › New version of Figure 1 to clarify intent
  • 25. 25 Agile Measures Software Data DID Backup For clarify: Cross references from Form 1 to Form 3 were eliminated by moving the Agile Measures section into Form 1 Comment Proposed Solution For clarity and consistency, update the Traditional Development Form and DID language to include the Agile Measures section that is currently found within the ERP section of the DID. Updated the Forms and DID to duplicate the ERP agile measures language and requirement within the Development form. Agile measures is now reflected on both forms and both sections in the DID
  • 26. 26 Submission Event Process Flow Diagram Software Data DID Backup Comment Proposed Solution Text contradicts the process flow diagram (Figure 1) found within the DID. Provide clarification to update the text and image, so it is in alignment with reporting (initial, interim, final) deliverables and expectations. Updated Figure 1 and associated text. Notional submission event diagram and associated text was confusing To avoid confusion, Figure 1 was updated (next slide)
  • 27. 27 Submission Event Process Flow Diagram Software Data DID Backup Event 1 Initial Contract Initial Event 2 Interim Event 3 Interim Event 4 Interim Complete Event 5 Interim Release 1 Release Initial Release 2 Contract . Contract Interim Release Interim Release 3 Release Final Contract Final No Change from previous report Release 1 Release 2 Release 2 Release 3 Release 4 This reflects what is contained in each individual submission This reflects the individual submissions Estimate In-Process/Actuals Final/Actuals Final/Actuals Final/Actuals Estimate Estimate Estimate Estimates Estimate In-Process/Actuals In-Process/Actuals In-Process/Actuals In-Process/Actuals In-Process/Actuals In-Process/Actuals In-Process/Actuals In-Process/Actuals In-Process/Actuals In-Process/Actuals Release 4 All Release Reports Due
  • 28. DID Comments Summary Software Data DID Backup 28 Of the 37 comments…. › Comments bucketed into 7 categories › Majority of comments were for additional Clarification or Questions › Comments were not received differentiating between critical, substantive, or administrative 18 10 3 3 1 1 1 0 2 4 6 8 10 12 14 16 18 20 # of Comments Clarification……………………….…………. 18 Question………………………………….…… 10 Administrative…………………………..…. 3 Process Diagram…………………………... 3 Duplication Comment…………………… 1 Not Actionable………………….………….. 1 General..…………..……………..…………… 1 37 Comments
  • 29. 29 Development, Format 1: DD Form 3026-1, Initial Report, Common Heading- Faked Software Data DID Backup PRIME MISSION PRODUCT AH-95 Karuk PHASE/MILESTONE: REPORTING ORGANIZATION TYPE Pre-A X B C - FRP X PRIME/ASSOCIATE CONTRACTOR A C - LRIP O&S DIRECT-REPORTING SUBCONTRACTOR GOVERNMENT PERFORMING ORGANIZATION DIVISION a. NAME: Stark Industries a. NAME: Stark Aircraft b. ADDRESS: 123 Stark Blvd. b. ADDRESS: 123 Stark Blvd. Orlando, FL Orlando, FL APPROVED PLAN NUMBER A-10-C-C25 CUSTOMER d. NAME: Prototype Development e. TASK ORDER/DELIVERY ORDER/LOT NO.: 01 REPORT TYPE INITIAL X INTERIM FINAL SUBMISSION NUMBER 2 X RDT&E RESUBMISSION NUMBER 0 PROCUREMENT REPORT AS OF (YYYYMMDD) 20120930 X O&M DATE PREPARED (YYYYMMDD) DEPARTMENT REMARKS: 20121130 b. END DATE (YYYYMMDD): c. SOLICITATION NO.: N/A b. MODIFICATION NO.: 1 APPROPRIATION N/A 20170930 TYPE ACTION a. CONTRACT NO: WQ5935-12-D-0001 PERIOD OF PERFORMANCE 20120501 a. START DATE (YYYYMMDD): Potts, Pepper Stark Aircraft - Software Division 123-456-7890 ppotts@starkindustries.com POC NAME (Last, First, Middle Initial) TELEPHONE (Include Area Code) EMAIL ADDRESS UNCLASSIFIED SECURITY CLASSIFICATION All software development and maintenance effort is under WBS element 1.1.2.1 Custom Software Design. There is no software development or maintenance under the other WBS elements. Appropriation is is both RDT&E and O&M (same as associated CSDR) Software Development Report UNCLASSIFIED SECURITY CLASSIFICATION SOFTWARE RESOURCES DATA REPORTING: Metadata MAJOR PROGRAM NAME: AH-95 Karuk The public reporting burden for this collection of information is estimated to average 16 hours per response, including the time for reviewing instructions, searching existing data sources, gathering and maintaining the data needed, and completing and reviewing the collection of information. Send comments regarding this burden estimate or any other aspect of this collection of information, including suggestions for reducing the burden, to Department of Defense, Washington Headquarters Services, Executive Services Directorate, Directives Division, 4800 Mark Center Drive, East Tower, Suite 02G09, Alexandria, VA 22350-3100 (0704-0188). Respondents should be aware that notwithstanding any other provision of law, no person shall be subject to any penalty for failing to comply with a collection of information if it does not display a currently valid OMB control number. PLEASE DO NOT RETURN YOUR COMPLETED FORM TO THE ABOVE ORGANIZATION. OMB Control Number 0704-0188 Expiration Date: 8/31/2016
  • 30. 30 Development, Format 1: DD Form 3026-1, Initial Report, Release Level - Faked Software Data DID Backup 1.0 Release Start Date 20120501 Release End Date Hours Per Staff Month Computed ? No 1.1.2.1 Mission Computer Update Definition ISO 12207 Processes included… SW RA X SW AD X SW DD X SW Con X SW I X SW QT X SW SP X 1.1.2.1 Smart Display Definition ISO 12207 Processes included… SW RA X SW AD X SW DD X SW Con X SW I X SW QT X SW SP X 1.2.1 Software Systems Engineering Definition ISO 12207 Processes included… SW RA X SW AD X SW DD SW Con SW I SW QT SW SP 1.3.1 Software Program Management Definition ISO 12207 Processes included… SW RA X SW AD SW DD X SW Con X SW I SW QT X SW SP 1.4.1.1 Development Test and Evaluation (SW-Specific) Definition N/A - Regression Testing is included as part of the Development Activities ISO 12207 Processes included… SW RA SW AD SW DD SW Con SW I SW QT SW SP 1.5.1 Training (SW-Specific) Definition N/A - Training is part of Program Management ISO 12207 Processes included… SW RA SW AD SW DD SW Con SW I SW QT SW SP 1.6.2.1 Engineering Data (SW-Specific) Definition N/A - Data is provided in the comments of the Mission Computer Update ISO 12207 Processes included… SW RA SW AD SW DD SW Con SW I SW QT SW SP Contractor-Defined Software Development Activities LEAD EVALUATOR CMMI Level 3 CERTIFICATION DATE SOFTWARE PROCESS MATURITY 20110923 The Mission Computer Update provides the control and operator interfaces for the AH-95 mission systems and subsystems. It provides the central computation and control facility that integrates the avionics suite into a cohesive system. The Mission Computer DEVELOPMENT ORGANIZATION CMMI Institute James Rhodes EVALUATOR AFFILIATION SOFTWARE DEVELOPMENT REPORT, FORMAT 1: Part 1 Software Development Technical Data, Release Level Release ID Release Name Mission Computer Update 2018 Baseline 20130930 Software Requirements Count Definition Total Labor Hours Total Staff 135000 The Software Requirements count is defined by counting the number of "Shall" statements in the latest Software Requirements Specification (SRS). This includes func External Interface Requirements Count Definition The External Interface Requirements count is defined by counting the number of unique Interface statements in each Subsystem Interface Description Document (IDD). 154 112 Software-Specific Common Elements (as defined in the CSDR Plan) Product Quality Reporting Definition The Smart Display (SD) is a software application that provides flight instrument graphics. It is used in cockpit display systems to support flight and mission activities. Program Management performed by Stark Industries in Orlando, FL consists of team management and training utilizing spiral development strategies through the life of the program Systems Engineering performed by Stark Industries in Orlando, FL consists of software requirements and initial design efforts
  • 31. 31 Development, Format 1: DD Form 3026-1, Initial Report, Release Level - Faked Software Data DID Backup Aerospace UCC Version Alternate Code Counter Description DD Form3026-1, MAY 2016 UNCLASSIFIED SECURITY CLASSIFICATION RELEASE-LEVEL COMMENTS The precedents listed above are similar projects in that they all share the engineers and development team. They also use the same development, lab environments and software tools. Alternate Code Counter Comparison to UCC Out of a comparison of 381 source files, the UCC tool counted 1.3% more logical lines of source code than Code Counter Plus. Third-Party Code Counter N/A Alt Code Counter Version 2.0 Flight Armor Software Mark 3 Automobile Flight Software Mark 21 Personal Helicopter Mission Computer Alternate Code Counter Name Code Counter Plus SYSTEM DESCRIPTION The AH-95 Karuk is an Attack Helicopter (AH) with the capability to perform the primary missions of Infantry Support and Close Air Combat. This software is an update to the Mission Computer for the helicopter. PRECEDENTS (List at least three similar systems.)
  • 32. 32 Development, Format 1: DD Form 3026-1, Initial Report, CSCI Level - Faked Software Data DID Backup Product and Development Description Functional Description Software Development Characterization Software State of Development (Check one only ) Prototype Production-Ready X Mix Operating Environment(s) (Check all that apply ) Surface Fixed Surface Vehicle Ordnance Systems Other Surface Mobile X Air Vehicle Missile Systems If Other, provide explanation Surface Portable Sea Systems Space Systems X Manned Unmanned Primary Application Domain (Check one only ) Microcode and Firmware Communication Software Tools Signal Processing X System Software Mission Planning Vehicle Payload Process Control Custom AIS Software Vehicle Control Scientific and Simulation Enterprise Service System Other Real-Time Embedded Test, Measurement, and Diagnostic Equipment Enterprise Information System Command and Control Training Application Domain Comments Development Process Software Development Method This product is a new development No This is a modification of an existing Mission Computer N/A - The program is currently in requirements development phase; Maintenance will begin immediately upon deployment The Mission Computer Update provides the control and operator interfaces for the AH-95 mission systems and subsystems. It provides the central computation and control facility that integ Spiral The methodology used for Software Development is the standard Spiral development process that has been used previously. The Software Development team will receive software requireme
  • 33. 33 Development, Format 1: DD Form 3026-1, Initial Report CSCI Level - Faked Software Data DID Backup Software Reuse Name Description Name Description etc. COTS / GOTS / Open-Source Applications Name Glue Code Estimated Configuration Effort Estimated Name Glue Code Estimated Configuration Effort Estimated etc. COTS/ GOTS Comments STAFFING (Initial and Interim Reports only ) Peak Staff Peak Staff Date Product and Development Description Comments Product Size Reporting Software Requirements Count Total 23854 Inherited Requirements 13502 4999 Modified Requirements 0 Deleted Requirements 0 Deferred Requirements 1535 Requirements Volatility Certification and Accreditation Rqts Security Requirements 3250 Safety Requirements 568 Privacy Requirements 0 Software Requirements Comments No requirements are implemented at this time, as this is the Initial Report and no work has yet been done. External Interface Requirements Count Total 1585 Inherited Requirements 950 270 Modified Requirements Deleted Requirements Deferred Requirements 335 Requirements Volatility Certification and Accreditation Rqts Security Requirements 30 Safety Requirements Privacy Requirements External Interface Requirements Comments The previous Mission Computer will be the baseline for the Upgrade Mission Computer Added/New Requirements Added/New Requirements 167 20130101 This is an Upgrade to the existing Mission Computer, so while there will be new development, it is not a new program No requirements are implemented at this time, as this is the Initial Report and no work has yet been done. N/A
  • 34. 34 Development, Format 1: DD Form 3026-1, Initial Report CSCI Level - Faked Software Data DID Backup SLOC-Based Software Size Primary Language (L1) Secondary Language (L2) L1 L2 Other PRIME ONLY L1 L2 Other ALL SUBCONTRACTORS L1 L2 Other 0 0 0 0 0 0 DM% CM% IM% AAF% DM% CM% IM% AAF% 0 0 0 0 0 0 0% 0% 0% 0% 0 0 0 766321 105230 0 0% 0% 766321 105230 0% 0% 0 0 0 0 0 0 0% 0% 0% 0% 766321 105230 0 766321 105230 0 0 0 0 0 0 0 L1 L2 Other PRIME ONLY L1 L2 Other ALL SUBCONTRACTORS L1 L2 Other 135268 1795 0 135268 1795 0 0 0 DM% CM% IM% AAF% DM% CM% IM% AAF% 0 0 0 0 0 0 0% 0% 0% 0% 0 0 0 766321 105230 0 0% 0% 766321 105230 0% 0% 0 0 0 0 0 0 0% 0% 0% 0% 901589 107025 0 901589 107025 0 0 0 0 0 0 0 AMOUNT OF DELIVERED CODE DEVELOPED NEW ACTUAL COUNTS TO DATE (Interim and Final Reports only ) WITH MODIFICATIONS WITHOUT MODIFICATIONS AMOUNT OF DELIVERED CODE DEVELOPED NEW AMOUNT OF DELIVERED CODE CARRYOVER (i.e., REUSED FROM PREVIOUS RELEASE) AMOUNT OF GOV'T FURNISHED CODE AMOUNT OF DELETED CODE AMOUNT OF DELIVERED CODE REUSED FROM EXTERNAL SOURCE (i.e., NOT CARRYOVER FROM PREVIOUS RELEASE) WITH MODIFICATIONS WITHOUT MODIFICATIONS AMOUNT OF DELIVERED CODE CARRYOVER (i.e., REUSED FROM PREVIOUS RELEASE) TOTAL DELIVERED CODE AMOUNT OF DELETED CODE WITH MODIFICATIONS AUTO GENERATED HUMAN GENERATED AUTO GENERATED WITH MODIFICATIONS WITHOUT MODIFICATIONS WITH MODIFICATIONS WITHOUT MODIFICATIONS AMOUNT OF DELIVERED CODE REUSED FROM EXTERNAL SOURCE (i.e., NOT CARRYOVER FROM PREVIOUS RELEASE) HUMAN GENERATED AMOUNT OF GOV'T FURNISHED CODE WITHOUT MODIFICATIONS TOTAL DELIVERED CODE ESTIMATES AT COMPLETION (Initial and Interim Reports only ) WITH MODIFICATIONS WITHOUT MODIFICATIONS
  • 35. 35 Maintenance Report, Format 2: DD Form 3026-2, Initial Report Common Heading - Faked Software Data DID Backup PRIME MISSION PRODUCT AH-95 Karuk PHASE/MILESTONE: REPORTING ORGANIZATION TYPE Pre-A X B C - FRP X PRIME/ASSOCIATE CONTRACTOR A C - LRIP O&S DIRECT-REPORTING SUBCONTRACTOR GOVERNMENT PERFORMING ORGANIZATION DIVISION a. NAME: Stark Industries a. NAME: Stark Aircraft b. ADDRESS: 123 Stark Blvd. b. ADDRESS: 123 Stark Blvd. Orlando, FL Orlando, FL APPROVED PLAN NUMBER A-10-C-C25 CUSTOMER N/A d. NAME: Prototype Development e. TASK ORDER/DELIVERY ORDER/LOT NO.: 01 REPORT TYPE INITIAL X INTERIM FINAL SUBMISSION NUMBER 2 X RDT&E RESUBMISSION NUMBER 0 PROCUREMENT REPORT AS OF (YYYYMMDD) 20120930 X O&M DATE PREPARED (YYYYMMDD) DEPARTMENT REMARKS: UNCLASSIFIED SECURITY CLASSIFICATION SOFTWARE RESOURCES DATA REPORTING: Metadata MAJOR PROGRAM NAME: AH-95 Karuk The public reporting burden for this collection of information is estimated to average 16 hours per response, including the time for reviewing instructions, searching existing data sources, gathering and maintaining the data needed, and completing and reviewing the collection of information. Send comments regarding this burden estimate or any other aspect of this collection of information, including suggestions for reducing the burden, to Department of Defense, Washington Headquarters Services, Executive Services Directorate, Directives Division, 4800 Mark Center Drive, East Tower, Suite 02G09, Alexandria, VA 22350-3100 (0704-0188). Respondents should be aware that notwithstanding any other provision of law, no person shall be subject to any penalty for failing to comply with a collection of information if it does not display a currently valid OMB control number. PLEASE DO NOT RETURN YOUR COMPLETED FORM TO THE ABOVE ORGANIZATION. OMB Control Number 0704-0188 Expiration Date: 8/31/2016 TYPE ACTION a. CONTRACT NO: WQ5935-12-D-0001 b. MODIFICATION NO.: 1 c. SOLICITATION NO.: N/A PERIOD OF PERFORMANCE APPROPRIATION a. START DATE (YYYYMMDD): 20120501 b. END DATE (YYYYMMDD): 20121130 POC NAME (Last, First, Middle Initial) TELEPHONE (Include Area Code) EMAIL ADDRESS 20170930 UNCLASSIFIED SECURITY CLASSIFICATION Potts, Pepper Stark Aircraft - Software Division 123-456-7890 ppotts@starkindustries.com All software development and maintenance effort is under WBS element 1.1.2.1 Custom Software Design. There is no software development or maintenance under the other WBS elements. Appropriation is is both RDT&E and O&M (same as associated CSDR)
  • 36. 36 Maintenance Report, Format 2: DD Form 3026-2, Initial Report Part 1 - Faked Software Data DID Backup System Description No. of Unique Baselines Maintained 1 - Mission Computer Update 2018 Baseline No. of Total Hardware Platforms This Software Operates On X Extensive Regular Event Driven Other If Other, provide explanation N/A MAINTENANCE ORGANIZATION James Rhodes CERTIFICATION DATE 20110923 LEAD GOVERNMENT ORGANIZATION LOCATION (Enter the lead government maintenance location) Software Requirements Count Definitions External Interface Requirements Count Definitions The AH-95 Karuk is an Attack Helicopter (AH) with the capability to perform the primary missions of Infantry Support and Close Air Combat. This software is an update to the Mission Computer for the helicopter. 4. The Mission Computer (MC) Block C and D hardware. The Flight Management Computer (FMC) Block B and C hardware. PRECEDENTS (List at least three similar systems by the same organization or team.) Flight Armor Software Mark 3 Automobile Flight Software Mark 21 Personal Helicopter Mission Computer GOVERNMENT ORGANIZATION NAME Army Aviation Command Suite 235 BLDG H-21 Huntsville, AL The Software Requirements count is defined by counting the number of "Shall" statements in the latest Software Requirements Specification (SRS). This includes functional, non-functional, security, safety, privacy and cyber security requirements. The External Interface Requirements count is defined by counting the number of unique Interface statements in each Subsystem Interface Description Document (IDD). This includes functional, non-functional, security, safety, privacy and cyber security requirements. SOFTWARE MAINTENANCE REPORT, FORMAT 2: Top Level PART 1 Software Maintenance Technical Data Operation Tempo (check one) SOFTWARE PROCESS MATURITY CMMI Maturity Level 3 LEAD EVALUATOR EVALUATOR AFFILIATION CMMI Institute
  • 37. COST ASSESSMENT DATA ENTERPRISE SURF Process Summary & Initial Findings: A Deeper Focus on Software Data Quality This document was generated as a result of the AFCAA-led, Software Resource Data Report Working Group (SRDRWG). This working group represented a joint effort amongst all DoD service cost agencies. The following guidance describes SRDR data verification and validation best practices as documented by NCCA, NAVAIR 4.2, AFCAA, ODASA-CE, MDA, and many more. Presented by: Ranae Woods, AFCAA, ranae.p.woods.civ@mail.mil Dan Strickland, MDA , daniel.strickland@mda.mil Nicholas Lanham, NCCA, nicholas.lanham@navy.mil Marc Russo, NCCA, marc.russo1@navy.mil Haset Gebre-Mariam, NCCA, haset.gebremariam@navy.mil
  • 38. 38 Table of Contents SURF Process Summary & Initial Findings › Purpose › SRDR Need Statement › SURF Purpose › SURF Created Process Initiation › SURF Team Structure › SURF Verification & Validation (V&V) Guide › SRDR V&V Process › SRDR Database › SRDR Data Quality Review › SURF Status and Metrics › Summary
  • 39. 39 Presentation Purpose SURF Process Summary & Initial Findings › To familiarize the audience with recent Software Resource Data Report (SRDR) Working Group (WG) efforts to update existing SRDR DID language and implement data quality improvement › To clarify how these SRDRWG efforts led to the development of a SRDR Unified Review Function (SURF) team › To highlight: − SURF mission − Highlight SURF team and Verification and Validation (V&V) guide positive impact on SRDR data quality
  • 40. 40 SURF Need Statement SURF Process Summary & Initial Findings Why do these reports need to be reviewed? › Reduces inaccurate use of historical software data – Aligns with OSD CAPE initiative(s) to improve data quality › Helps correct quality concerns prior to final SRDR acceptance › Allows a central group of software V&V SMEs to tag SRDR data › SRDR submissions are used by all DoD cost agencies when developing or assessing cost estimates › Quality data underpins quality cost and schedule estimates BBP Principle 2: Data should drive policy. Outside my door a sign is posted that reads, "In God We Trust; All Others Must Bring Data." The quote is attributed to W. Edwards Deming - Mr. Frank Kendall, AT&L Magazine Article, January-February 2016
  • 41. Question: What services helped develop the questions included within the latest SRDR V&V guide? Answer: All services participating in the SRDR WG provided feedback, comments, and reviews over a year long SRDRWG effort focused on establishing higher quality review efforts coupled with an ongoing SRDR DID update 41 SURF Purpose SURF Process Summary & Initial Findings Purpose › To supplement the Defense Cost Resource Center (DCARC) quality review for SRDR submissions › To develop a consistent, service-wide set of quality questions for all DoD cost community members to reference › To provide a consistent, structured list of questions, focus areas, and possible solutions to cost community members tasked with inspecting SRDR data submissions for completeness, consistency, quality, and usability (e.g. SRDR V&V Guide) Why? › SURF represents an effort to establish a consistent guide for any organization assessing the realism, quality, and usability of SRDR data submissions › Quality data underpins quality cost and schedule estimates
  • 42. Question: How was the SURF team created and is it linked to the SRDRWG? Answer: Yes. The SRDR Unified Review Function (SURF) team was organized as part of the larger, SRDRWG initiative during 2015 42 How Was SURF Created? SURF Process Summary & Initial Findings 1. Revised SRDR Development Data Item Description (DID) 2. New SRDR Maintenance Data Item Description (DID) 3. Joint Validation & Verification (V&V) Guide, Team, and Process 4. Software Database Initial Design and Implementation Process 1. Reduces inconsistency, lack of visibility, complexity, and subjectivity in reporting 2. Aligned w/ dev. but w/ unique data/metrics available/desired for maintenance phase 3. Higher quality, less duplication - ONE central vs many distributed; 1 joint team & guide gives early, consistent feedback to ktrs 4. Avoids duplication, variations - ONE central vs many distributed; Based on surveyed best practices and user expectations Recommendation Benefit
  • 43. Question: How do members get involved with SURF? Why are there “primary” and “secondary” members? Answer 1: The SURF team was established by Government SRDRWG members who were recommended/volunteered by each DoD service Answer 2: Primary members are included on CSDR S-R IPT email notifications for their specific service. Secondary members are contacted during periods of increased review demands, if necessary. SURF Secondary: SURF Primary: DCARC Analyst & STC: DoD William Raines Navy Corrinne Wallshein Wilson Rosa Stephen Palmer Philip Draheim Sarah Lloyd Marine Corps Noel Bishop John Bryant Air Force Ron Cipressi Janet Wentworth Chinson Yew Eric Sommer Army Jim Judy Jenna Meyers James Doswell Michael Smith Michael Duarte SPAWAR Jeremiah Hayden Min-Jung Gantt MDA Dan Strickland 43 SURF Team Structure SURF Process Summary & Initial Findings › Team is comprised of one primary member per service along with support from secondary team members (Government Only) › As submissions are received, SRDR review efforts will be distributed amongst SURF team members to balance workload › SURF Team Coordinators (STC): Marc Russo & Haset Gebre-Mariam › Current SURF structure: SURF Team Coordinators (STC) Marc Russo Haset Gebre-Mariam SURF Advisor & Process Owner (SAPO) Nick Lanham SRDR Submission received from DCARC
  • 44. Question: Did a standardized-joint service, software-specific quality review guide exist prior to the SURF V&V guide? Who contributed to the development of this document? Answer 1: No. Services implemented very inconsistent SRDR review methodologies (if conducted at all) prior to DCARC acceptance Answer 2: The SRDR V&V guide was developed by the SURF team and has been reviewed by numerous SRDRWG, OSD CAPE, and other cost community team members. Feedback from other services has generated significant improvements from initial draft. 44 SRDR V&V Guide SURF Process Summary & Initial Findings Guide represents first-ever, joint effort amongst DoD service cost agencies › OSD public release approved 5 April 2016 › Kickoff email distributed on 1 May 2017 to update guide with latest DID requirements › Files can be downloaded using following link: http://cade.osd.mil/roles/reviewers#surf Enables ability to consistently isolate software cost relationships and trends based on quality SRDR data › Now includes quick-reference MS excel question checklist by SRDR DID section Two main purposes: › SRDR V&V training guide (V&V questions) › Focus areas used to determine SRDR quality tags
  • 45. V&V Questions and Examples Developed and Organized by Individual SRDR reporting Variable 45 SRDR V&V Guide Table of Contents (TOC) SURF Process Summary & Initial Findings 1.0 Review of an SRDR submitted to DCARC 1.1 Reporting Event 1.2 Demographic Information 1.3 Software Char. and Dev. Process 1.3.1 Super Domain and Application Domains 1.3.2 Operating Environment (OE) Designation 1.3.3 Development Process 1.4 Personnel 1.5 Sizing and Language 1.5.1 Requirements 1.5.2 Source Lines of Code (SLOC) 1.5.3 Non-SLOC Based Software Sizing 1.5.4 Product Quality Reporting 1.6 Effort 1.7 Schedule 1.8 Estimate at Completion (EAC) Values 2.0 Quality Tagging 3.0 Solutions for Common Findings 3.1 Allocation 3.2 Combining 3.3 Early Acquisition Phase Combining 4.0 Pairing Data 5.0 Possible Automation Appendix A – SD and AD Categories Appendix B – Productivity Quality Tags Appendix C – Schedule Quality Tags Appendix D – SRDR Scorecard Process
  • 46. V&V Guide Includes Specific Questions For SURF Members to Confirm Prior to Accepting the Report 46 SRDR V&V Guide Example Questions SURF Process Summary & Initial Findings – Was effort data reported for each CSCI or WBS? – Was effort data reported as estimated or actual results? If the submission includes estimated values and actual results, does the report include a clear and documented split between actual results and estimated values? – Is the effort data reported in hours? – Is effort data broken out by activity? – What activities are covered in the effort data? Is there an explanation of missing activities included within the supporting SRDR data dictionary? …. When assessing Effort, the V&V priority is determining completeness Determining completeness is not always easy due to: – The contractor possibly collecting/reporting their actual performance using categories that differ from the IEEE 12207 standard – The contractor reporting all of their Effort within the Other category Common questions to ask when looking at the effort are: Section 1.6: Effort
  • 47. DCARC: Step 1 •SRDR status list sent to SURF Team Coordinator SURF: Step 1 •SRDR status list distributed to Primary and Secondary POCs SURF: Step 2 •Conduct V&V reviews by populating MS Excel question template SURF: Step 3 •Provide completed V&V question templates back to DCARC DCARC: Step 2 •Combine SURF and DCARC comments •Coordinate comment resolution with submitting organization Database: Step 1 •Adjudicated SRDR sent to NAVAIR 4.2 for data entry into DACIMs dataset •Note: Future database(s) will be hosted via CADE Varies by No. Submissions 47 SURF Team V&V Process SURF Process Summary & Initial Findings 1st week of every month +2 Days + 13 Days NLT + 14 Days Varies by Contractor Purpose of SURF Process: To provide completed V&V checklists to DCARC within 2 weeks of request Important Note: CADE is developing relational databases for new DID formats. Over time, data entry will be automated. Until that time, manual data entry continues by NAVAIR 4.2 team for only the development format. Please refer to V&V guide for additional automation details and future data quality initiatives Monthly Recurring SURF and DCARC Communications
  • 48. 48 SRDR Database Location SURF Process Summary & Initial Findings › Login to CADE and click on “CADE Portal Login” − http://cade.osd.mil/ › Select “My CADE” from the menu, and click on “CADE Library” › Use to Advanced Search and filter on “Container Types” under “Software Data” Question: Where does SRDR data go after SURF Review? Answer: Once SRDR record has been accepted, Data is entered into SRDR dataset posted to CADE>DACIMs web portal Question: Who enters the data into the dataset? Answer: Currently members from NAVAIR 4.2 enter data to SRDR dataset (10+ years of experience). Future data entry is planned to be automated using .XML schemas linked to latest DID formats
  • 49. › SRDR database is available to Government analysts with access to the CADE portal – This dataset is the authoritative source for SRDR data (10+ years of uploads) › Data is not automatically considered “Good” for analysis › SURF team may recommend DCARC not accept a submission due several data quality concerns outlined in the V&V guide. Examples include: – Roll-up of lower level data (Did not want to double count effect) – Significant missing content in hours, productivity, and/or SLOC data missing – Interim build actual that is not stand alone – Inconsistencies or oddities in the submit – Additional reasons discussed in the V&V guide 49 SRDR Data Quality Review SURF Process Summary & Initial Findings Data Segments Dec-07 Dec-08 Oct-10 Oct -11 Aug-13 Apr-14 Apr-15 Dec-16 CSCI Records 688 964 1473 1890 2546 2624 2853 3487 Completed program or build 88 191 412 545 790 911 1074 1326 Actuals considered for analysis (e.g., “Good”) 0 119 206 279 400 403 682 829 Paired Initial and Final Records 0 0 78 142 212 212 212 240 Dataset Posted to OSD CAPE DACIMS Web Portal
  • 50. SURF Team Combined With V&V Guide and DCARC Have Significantly Improved Software Data Quality › Prior to SURF process, only 15% of SRDR data was considered “Good” › After one+ year of SURF reviews, ~24% of data has been tagged as “Good” › Army team currently working to review historical data. Once completed, “Good” percentage will likely increase to ~31% 50 SRDR Data Quality Review SURF Process Summary & Initial Findings 1890 2546 2624 2853 3487 279 400 403 682 829 0 500 1000 1500 2000 2500 3000 3500 4000 11-Oct 13-Aug 14-Apr 15-Apr 16-Dec Total Records (e.g., CSCIs) Actuals considered for analysis (e.g., “Good”) 2011-2017 Trend Analysis
  • 51. 51 SURF Team Status SURF Process Summary & Initial Findings › Recurring SURF team meetings kicked off on 23 June 2015 – Group includes ~19 Government team members from across the DoD – Has received very positive feed back from DoD cost estimation community, DCARC analyst(s), and even program office communities since inception – Completed initial version of SRDR V&V guide March 2015 – Initiated SURF Initial team training using draft V&V guide June 2015 – Completed development of SURF team charter July 2015 – During training period, SURF generated 483 V&V comments provided to DCARC (June 2015 to March 2016) – Completed official SURF kickoff with DCARC and published V&V guide March 2016 – After training period, formal SURF process generated 889 V&V comments (March 2016 to December 2016) – Concluding CY16, SURF team generated 1,372 V&V comments from 92 SRDR submissions (1,282 during CY16) › Current Status – CY17 represents first full year of official SURF reviews using published V&V guide – Team recently kicked off effort to update existing V&V questions to align with the latest SRDR DID – Co-chaired Collaborative Cost Research Group (CCRG) focused on increasing “Good” SRDR records (March 2017) – Continued process improvement efforts to maintain efficient and effective process – Working with DCARC to develop SURF User Interface within CADE V&V Comments Have Significantly Improved SRDR Data Quality Key Facts As of June 2017
  • 52. 52 SURF Comments by V&V Category SURF Process Summary & Initial Findings All Reviews (Mar – Dec 2016)
  • 53. 53 SURF Team Findings by V&V Category SURF Process Summary & Initial Findings All Reviews (Mar – Dec 2016)
    › V&V comments are generated when SURF members answer a question with “No”
    › Common trends for “No” responses:
      – Reports not including all types of requirement counts (e.g., new, modified, inherited, deleted, cybersecurity, etc.)
      – Reports not including COTS/GOTS “glue code” Software Lines of Code (SLOC) totals
      – Reports not including SLOC counts using the USC Unified Code Count (UCC) tool
      – Reports not including software defect counts
      – Reports not including a subset of required metadata (e.g., TD, EMD, Prototype, Service, Milestone, Contract Type, etc.)
    › Common trends for “Yes” responses:
      – Reports include a subset of metadata (e.g., contractor, contract number, report type, report POC, program name, etc.)
      – Reports typically have SLOC counts broken out by new, modified, reuse, auto, and deleted
      – Reports typically include primary language type designation
      – Reports typically include “Effort” broken out by activity
    › Common trends for “N/A” responses:
      – Reports typically do not include “Forecast At Completion (FAC)” values
      – Reports typically do not include non-SLOC sizing metrics (Function Points, RICE-FW, etc.)
      – SURF analyst typically does not have access to the corresponding CSDR plan (working with CADE to develop a SURF portal)
  • 54. 54 SURF V&V Guide SURF Process Summary & Initial Findings
    Process for Update:
    Currently, SURF members are updating or creating draft question lists to account for the new DIDs for Development, Maintenance, and ERP
    › Updates to the development question lists include improvements from lessons learned over the previous year
    Draft question lists will then be sent to a larger group of SRDR-focused team members to ensure the question lists are reasonable and that they address data quality concerns
    › Important to keep question lists to a reasonable size for continued SURF success
    V&V guide and question templates to be updated to incorporate the new questions as well as other lessons learned
    Send the updated V&V guide to the larger SRDR working group and senior management for final comments/feedback
    Send the updated V&V guide to OSD for final PAO approval and posting to CADE
  • 55. 55 SURF Summary SURF Process Summary & Initial Findings
    SURF is focused on improving data quality and helping support a robust Government review process.
    We would like to thank all of the DoD and non-DoD individuals who have commented, participated, and provided feedback throughout the past few years.
    Please feel free to use the contact information below if you would like more information regarding SURF, the SRDR V&V Guide, or the checklist.
    Marc Russo, Naval Center for Cost Analysis (NCCA), NIPR: Marc.russo1@navy.mil
    Ron Cipressi, Air Force Cost Analysis Agency, NIPR: Ronald.p.cipressi.civ@mail.mil
    Nicholas Lanham, Naval Center for Cost Analysis (NCCA), NIPR: Nicholas.lanham@navy.mil
    Dan Strickland, Missile Defense Agency (MDA), NIPR: Daniel.strickland@mda.mil
  • 56. COST ASSESSMENT DATA ENTERPRISE SURF Process Summary Backup 56
  • 57. 57 V&V Questions Most Frequently With “No” Response SURF Process Summary & Initial Findings All Reviews (Mar – Dec 2016)
    Question ID | Question from V&V Guide Template | N(No) | N(Yes) | N(N/A) | N(No Resp.)
    1.5.1.1 | Does the submission clearly illustrate the number of Inherited, Added, Modified, Deleted, and Deferred requirements for both internal and external categories? | 31 | 0 | 1 | 0
    1.5.2.16 | If COTS or GOTS items have been included within the submission, has the submitting organization provided the SLOC total required to integrate the identified COTS/GOTS product (i.e., glue code)? | 28 | 0 | 4 | 0
    1.5.1.2 | Has the submitting organization separated the provided requirements by Security, Safety, and Privacy or Cybersecurity? | 26 | 1 | 5 | 0
    1.5.2.4 | Did the submitter use the Aerospace-approved version of the University of Southern California (USC) Center for Systems and Software Engineering (CSSE) Unified Code Count (UCC) tool to count the provided SLOC totals? If not, was the name of the code counting tool used by the submitting organization included within the supporting comments section and/or data dictionary? | 25 | 6 | 1 | 0
    1.2.5 | Has the system description been included within the submission? | 24 | 8 | 0 | 0
    1.2.2 | Has the Major Defense Acquisition Program (MDAP) or Major Automated Information System (MAIS) designation been listed? | 20 | 2 | 10 | 0
    1.3.2.3 | Has the state of development been identified (For example: Prototype, Production Ready, or a mix of the two)? | 19 | 11 | 2 | 0
    1.5.4.2 | Has the priority level for each category of software defects been provided? | 18 | 9 | 5 | 0
    1.1.9 | Is it clear if the information represents a Technology Demonstration (TD) or Engineering and Manufacturing Development (EMD) phase if the program is in that stage of development? | 17 | 8 | 7 | 0
    1.5.2.13 | Has the contractor or submitting organization provided the name of the software products that have been referenced to generate the provided reuse SLOC totals? | 17 | 14 | 1 | 0
    1.5.4.1 | Has the submitting organization provided a breakout of the number of software defects Discovered, Removed, and Deferred? | 17 | 10 | 5 | 0
    1.7.2 | Has the submitting organization clearly stated if the provided schedule data was reported as estimated, allocated, or actual results? | 16 | 16 | 0 | 0
    1.2.16 | Is the specific U.S. Military service branch or customer identified (For example: Navy, Air Force, Army, prime contractor, etc.)? | 15 | 14 | 3 | 0
    1.3.1.1 | Does the SRDR submission, comments section, or data dictionary include a clear system-level functional description and software operational overview? | 15 | 17 | 0 | 0
    1.2.6 | Have the program phase and/or milestone been included within the report (for example: Pre-A, A, B, C-LRIP, C-FRP, O&S, etc.)? | 14 | 18 | 0 | 0
    1.3.2.1 | Does the SRDR data dictionary include a clear system-level functional description and software operational overview? | 14 | 17 | 1 | 0
    1.2.19 | Has the contract Period of Performance (PoP) been identified? | 13 | 19 | 0 | 0
    1.2.23 | Does the submission include adequate detail within the comments section to support analysts who may reference the submission sometime in the future (For example: Provide context for analyzing the provided data, such as any unusual circumstances that may have caused the data to diverge from historical norms)? | 13 | 18 | 1 | 0
    1.3.3.3 | If an upgrade, does the SW sizing reflect significant reuse or modification SLOC totals when compared to New code? | 13 | 16 | 2 | 1
    1.4.5 | Does the peak headcount make sense against the reported schedule and hours? A simple test is to divide the total reported hours by the schedule months and then convert the resulting average monthly hours into an average Full Time Equivalent (FTE) count using the reported hours in a man-month. The peak headcount must be higher than this FTE monthly average; at the same time, the peak headcount should not be wildly disproportional to that average either. | 13 | 17 | 1 | 1
    1.5.2.10 | Were code adaptation factors reported (percent redesign, recode, reintegration)? Do they appear to be unique for each CSCI, or are they standard rules of thumb? | 13 | 4 | 15 | 0
    1.2.17 | Has the specific contract type been identified? For contracts, task orders, or delivery orders with multiple CLINs of varying contract types, the Contract Type reporting should be the one associated with the plurality of cost. | 12 | 20 | 0 | 0
    1.2.22 | Has the funding appropriation been identified (for example: Research, Development, Test and Evaluation (RDT&E), Procurement, Operation and Maintenance (O&M), Foreign Military Sales (FMS), etc.)? | 12 | 20 | 0 | 0
    1.2.4 | Has the Defense material item category been provided in accordance with MIL-STD-881C guidance (for example: aircraft, radar, ship, Unmanned Aerial Vehicle (UAV) system)? | 12 | 12 | 8 | 0
    1.3.3.5 | Has the development method also been identified (for example: Structured Analysis, Object Oriented, Vienna Development, etc.)? | 12 | 20 | 0 | 0
    1.6.2 | Was effort data reported as estimated or actual results? If the submission includes estimated values and actual results, does the report include a clear and documented split between actual results and estimated values? | 12 | 18 | 2 | 0
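This table, and the two that follow, tally how the 32 submissions reviewed in this window answered each V&V question (the counts in each row sum to 32). Below is a minimal sketch of how such a tally could be produced from per-review checklist results; the record layout is an assumption for illustration, not the SURF team's actual tooling.

```python
# Illustrative tally of SURF checklist responses by question, producing
# N(No)/N(Yes)/N(N/A)/N(No Resp.) style counts like those in the backup tables.
# The review-record layout is an assumption for this sketch.
from collections import Counter, defaultdict

reviews = [
    {"1.2.12": "Yes", "1.5.1.1": "No", "1.5.3.2": "N/A"},   # one reviewed submission
    {"1.2.12": "Yes", "1.5.1.1": "No", "1.5.3.2": "N/A"},   # another reviewed submission
]

tally = defaultdict(Counter)
for review in reviews:
    for question_id, response in review.items():
        tally[question_id][response] += 1

for question_id, counts in sorted(tally.items()):
    print(question_id, dict(counts))
# e.g., "1.5.1.1 {'No': 2}" aggregated across all reviews in scope
```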
  • 58. 58 V&V Questions Most Frequently With “Yes” Response SURF Process Summary & Initial Findings All Reviews (Mar – Dec 2016)
    Question ID | Question from V&V Guide Template | N(No) | N(Yes) | N(N/A) | N(No Resp.)
    1.2.12 | Is the contract number reported? | 0 | 32 | 0 | 0
    1.2.20 | Has the report type been identified (for example: Initial, Interim, or Final)? | 0 | 32 | 0 | 0
    1.2.7 | Has the contractor or organization that performed the work been identified? | 0 | 32 | 0 | 0
    1.2.21 | Is there a single submission Point of Contact (POC) and supporting contact information included within the report? | 3 | 29 | 0 | 0
    1.7.1 | Has schedule data been included in the submission? | 3 | 29 | 0 | 0
    1.7.4 | Is schedule data broken out by SRDR activity? | 3 | 29 | 0 | 0
    1.1.2 | Does the report reference the CSDR Plan? | 4 | 28 | 0 | 0
    1.2.1 | Has the program name been identified? | 4 | 28 | 0 | 0
    1.5.2.1 | Was the primary programming language reported? | 4 | 28 | 0 | 0
    1.5.2.6 | Are the SLOC counts for different types of code (e.g., new, modified, reused, auto-generated, Government-furnished, and deleted) separated or are they mixed together? | 4 | 28 | 0 | 0
    1.6.3 | Is the effort data reported in hours? | 4 | 28 | 0 | 0
    1.6.4 | Is effort data broken out by activity? | 4 | 28 | 0 | 0
    1.6.5 | Were the specific ISO 12207:2008 activities that are covered in the effort data (For example: Requirements analysis, architectural design, detailed design, construction, integration, qualification testing, and support processes) clearly discernible? | 4 | 28 | 0 | 0
    1.1.6 | Is there an easily identifiable event associated with the submission (for example: Contract Award, Build 2 Release, Build 1 Complete, Contract Complete, etc.)? | 5 | 27 | 0 | 0
    1.6.1 | Was effort data reported for each CSCI or WBS? | 5 | 27 | 0 | 0
    1.2.14 | Is the software process maturity and quality reporting definition provided (For example: Capability Maturity Model (CMM), Capability Maturity Model Integration (CMMI), or other alternative rating)? | 4 | 27 | 1 | 0
    1.7.3 | Has schedule data been reported in number of months from contract start or as calendar dates? | 3 | 27 | 2 | 0
    1.2.13 | Are precedents reported and consistent from submission to submission? | 1 | 27 | 4 | 0
    1.3.3.4 | What precedents or prior builds are identified to give credibility to the upgrade designation? | 1 | 27 | 3 | 1
    1.3.3.2 | Has the contractor indicated whether the software is an upgrade or new development? If not, why not? | 6 | 26 | 0 | 0
    1.4.3 | Does the data dictionary define what the skill level requirements are, and is the contractor adhering to that definition? | 6 | 26 | 0 | 0
    1.2.15 | Is the Process Maturity rating reported with an associated date, and has it changed from a prior submission? | 3 | 26 | 3 | 0
    1.2.10 | Has the contractor or submitting organization illustrated whether they were the primary or secondary developer? | 7 | 24 | 1 | 0
    1.6.6 | Have common WBS elements/labor categories such as System Engineering (SE), Program Management (PM), Configuration Management (CM), or Quality Management (QM) been broken out separately? | 7 | 24 | 1 | 0
    1.7.5 | Does the report include unique schedule start and end date values? For example, do multiple records have the same schedule data, e.g., same calendar dates for multiple WBS/CSCIs or builds? | 7 | 24 | 1 | 0
    1.2.3 | Has the Prime Mission Product (PMP) name been clearly identified (for example: most current official military designation)? | 5 | 24 | 3 | 0
  • 59. 59 V&V Questions Most Frequently With “N/A” Response SURF Process Summary & Initial Findings All Reviews (Mar – Dec 2016)
    Question ID | Question from V&V Guide Template | N(No) | N(Yes) | N(N/A) | N(No Resp.)
    1.5.3.2 | If function points have been provided, has the submitting organization clearly illustrated the function point count type (For example: Enhancement Project, Application, or Development Project)? | 0 | 0 | 32 | 0
    1.5.3.3 | Has the submitting organization provided the number of Data Functions and Transactional Functions (For example: Internal Logic Files, External Interface File, External Inquiries, External Inputs, and External Outputs)? | 0 | 0 | 32 | 0
    1.5.3.4 | Has the submitting organization included the Value Adjustment Factor? | 0 | 0 | 32 | 0
    1.5.3.5 | If the submitting organization has provided sizing metrics using the Reports, Interfaces, Conversions, Extensions, Forms, and Workflows (RICE-FW) convention, has the complexity of each RICE-FW category been provided? | 0 | 0 | 32 | 0
    1.8.1 | FACH: Has a description been provided that describes which ISO 12207:2008 elements have been included within the provided total? | 1 | 0 | 31 | 0
    1.8.2 | FACH: Do sub-element FAC values sum to the parent FAC total value? | 1 | 0 | 31 | 0
    1.8.3 | If the report is a final report, does the provided ATD total match the provided FAC total? | 1 | 0 | 31 | 0
    1.1.1 | Is the submission compliant with the CSDR Plan, i.e., a comparison of the submission to the plan requirement? | 1 | 1 | 30 | 0
    1.5.3.1 | Were SLOC counts reported, or were other counting or sizing metrics used (e.g., function points, use cases, rung logic ladders, etc.)? If so, has the submitting organization obtained the appropriate authorization to report non-SLOC based sizing within the corresponding CSDR plan? | 0 | 3 | 29 | 0
    1.6.17 | If subcontractor hours have not been provided, did the reporting organization provide subcontractor dollars? | 2 | 1 | 29 | 0
    1.5.2.17 | If COTS or GOTS integration or glue code has been included within the submission, does the total seem realistic when compared to the total SLOC included in the CSCI or WBS element (For example: COTS integration code equals 500 KSLOC and the total SLOC for the specific CSCI or WBS element equals 150 KSLOC)? Note: this scenario sometimes occurs when the submitting organization counts the total SLOC of the specified COTS or GOTS product vice the integration or glue code required to integrate the product. | 3 | 0 | 28 | 1
    1.6.9 | Do the children or lower-level WBS/CSCI elements add up to the parent? If not, is there effort that is only captured at a higher-level WBS/CSCI level that should be allocated to the lower-level WBS/CSCI elements? | 5 | 3 | 24 | 0
    1.2.18 | Has the total contract price been identified? | 9 | 0 | 23 | 0
    1.5.1.3 | Does the number of requirements trace from the parent to the children in the WBS? If not, this could imply that some portion of the software effort is only captured at higher-level WBS/CSCI elements and should be cross checked. | 3 | 4 | 23 | 2
    1.5.2.9 | For a Final report, does the size look realistic? For example: is all of the code rounded to the nearest 1000 lines, or does the dictionary indicate that they had difficulty counting code that may have come from a subcontractor? | 1 | 9 | 22 | 0
    1.1.7 | If there are prior submissions, is this submission an update to a prior submission or a new event? If the submission is an update to an existing submission, does the latest submission clearly describe what report the prior submission is linked to? | 2 | 8 | 21 | 1
    1.2.11 | If effort was outsourced, has the outsourced organization been provided? | 4 | 7 | 21 | 0
    1.6.7 | Is there an explanation of missing activities included within the supporting SRDR data dictionary? | 7 | 4 | 21 | 0
    1.1.8 | If a prior submission exists, is the information that has changed readily identifiable and a reason for the change provided (either in the data dictionary or comments section)? | 5 | 6 | 20 | 1
    1.4.4 | Does the skill mix make sense relative to the complexity of the code (unusual amount of very low or very high skill mix, for example)? | 0 | 12 | 20 | 0
    1.5.4.3 | If the report is an interim or final submission, has the number of Discovered, Removed, and Deferred defects changed from the previous submission? If significant changes have occurred, does the supporting comments section and/or data dictionary provide details regarding what drove the significant change in product quality metrics? | 10 | 2 | 20 | 0
    1.6.12 | Does the submission include unique values for each of the lower-level CSCI or WBS elements? For example, do multiple related records have the same effort data (i.e., activity effort is repeated or total effort is repeated)? | 6 | 6 | 20 | 0
    1.4.2 | If there was a prior submission, has the skill mix changed dramatically and, if so, is there an explanation why? Conversely, did it remain unchanged? If so, why? | 8 | 4 | 19 | 1
    1.5.2.14 | When subcontractor code is present, is it segregated from the prime contractor effort, and does it meet the same criteria for quality as the prime’s code count? | 6 | 7 | 19 | 0

Editor's Notes

  1. The reports are not management reports and are not intended for tracking progress
  2. Development: CSCI is the lowest level of software development at which configuration management is performed by the developer, usually indicated by a separate Software Development Folder (SDF), Software Requirements Specification (SRS), etc. The CSDR plan will serve as the authoritative definition for reporting purposes. Maintenance: The Top Level Data includes all information applicable to the software maintenance release(s) for the reporting event, defines each of the data elements as required, and describes the methods and rules used to perform the data measurement or estimation. The Release Level Data is used to obtain the actual (as-built) characteristics of the maintenance product and its maintenance process at the Release level. ERP: Part 1, System Technical Data, provides the report context, describes both the software product being developed and the development team, lists project requirements for modules, business processes, system functions and interfaces, and captures software product size and complexity at the Release Level. Part 2, Software Development Effort Data, captures project resource and schedule information at the Release Level. Format 3 uses the term “release” to refer to commonly used terms such as build, product build, and increment, but this document will use release as the primary term throughout.
  3. Format 2: The Top Level Data includes all information applicable to the software maintenance release(s) for the reporting event, defines each of the data elements as required, and describes the methods and rules used to perform the data measurement or estimation. The Release Level Data is used to obtain the actual (as-built) characteristics of the maintenance product and its maintenance process at the Release level.
  4. The reports are not management reports and are not intended for tracking progress
  5. CSCI is the lowest level of software development at which configuration management is performed by the developer, usually indicated by a separate Software Development Folder (SDF), Software Requirements Specification (SRS), etc. The CSDR plan will serve as the authoritative definition for reporting purposes.
  6. Introducing Agile Measures supports capturing the LOE with this SW dev methodology
  7. The Top Level Data includes all information applicable to the software maintenance release(s) for the reporting event, defines each of the data elements as required, and describes the methods and rules used to perform the data measurement or estimation. The Release Level Data is used to obtain the actual (as-built) characteristics of the maintenance product and its maintenance process at the Release level.
  8. The Top Level Data includes all information applicable to the software maintenance release(s) for the reporting event, defines each of the data elements as required, and describes the methods and rules used to perform the data measurement or estimation. The Release Level Data is used to obtain the actual (as-built) characteristics of the maintenance product and its maintenance process at the Release level.
  9. Part 1, System Technical Data, provides the report context, describes both the software product being developed and the development team, lists project requirements for modules, business processes, system functions and interfaces, and captures software product size and complexity at the Release Level. Part 2, Software Development Effort Data, captures project resource and schedule information at the Release Level. Format 3 uses the term “release” to refer to commonly used terms such as build, product build, and increment, but this document will use release as the primary term throughout.
  10. Similar look and feel to Formats 1 & 2
  11. Similar look and feel to Formats 1 & 2
  12. Similar look and feel to Formats 1 & 2
  13. Similar look and feel to Formats 1 & 2
  14. Similar look and feel to Formats 1 & 2