The document provides an overview of the Software Resource Data Reporting (SRDR) forms and Data Item Description (DID). It describes the three formats: Format 1 for software development, Format 2 for maintenance reporting, and Format 3 for Enterprise Resource Planning programs. Recent updates include adding measures for agile development methodologies to Format 1 and clarifying software change definitions for Format 2. The intent is to collect standardized software cost and metrics data for use by DoD and industry analysts.
2. 2
Overview
Software Resource Data Reporting (SRDR)
Resources
Data Item Description (DID): DI-MGMT-82035
Development, Format 1: DD Form 3026-1
Maintenance, Format 2: DD Form 3026-2
Enterprise Resource Planning (ERP), Format 3: DD Form 3026-3
Intent of SRDR is to collect objective and measurable data commonly used by industry and DoD cost analysts
Builds a repository of estimated and actual:
› SW product sizes
› Schedules
› Effort
› Quality
3. 3
New Formats
Software Resource Data Reporting (SRDR)
Development
› Part 1 - Technical Data: SW size, context, technical information; Release level and computer SW configuration item (CSCI) level sections
› Part 2 - Effort Data: Reports SW efforts associated with each reported release and CSCI
Maintenance
› Part 1 - Technical Data: SW size, context, technical information; Top level and release level sections
› Part 2 - Effort Data: Reports the to-date SW maintenance efforts for each in-progress and completed release(s), and total maintenance activities
ERP
› Part 1 - Technical Data: Provides context, SW product, Development Team, etc.
› Part 2 - Effort Data: Project resource and schedule information at the release level
4. 4
Updates
Software Resource Data Reporting (SRDR)
Format 1 (Development) Updates
Introduction of Agile Measures
› Supports capturing the level of effort (LOE) with this SW development methodology
Instructions captured within DID
Format 2 (Maintenance) Updates
Clarified SW change count definitions
› Confusion in original DID regarding inclusion of Information Assurance Vulnerability Alerts (IAVAs)
› Updated instructions to ensure IAVA changes are mutually exclusive from Priority 1-5 changes, and are not double counted
Moved SW License Management Hours reporting to Project Management (PM)
› SW License Management is a PM activity
No Additional Significant Updates to Formats 1 and 2
5. 5
Submission Event Process Flow Diagram
Software Resource Data Reporting (SRDR)
[Figure: Submission Event Process Flow. Five submission events span the contract: Event 1 (Contract Initial), Events 2 through 4 (Contract Interim), and Event 5 (Contract Final). Each of Releases 1 through 4 is first reported as an Estimate (Release Initial), then as In-Process/Actuals in interim reports (Release Interim), and as Final/Actuals once complete (Release Final); all release reports are due by the contract-final event, and a submission may be marked "No Change from previous report." One lane of the diagram reflects what is contained in each individual submission; the other reflects the sequence of individual submissions.]
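The staggered estimate-to-actuals cadence is easier to see in data form. The notional Python sketch below encodes one plausible reading of the figure; the exact event-to-release mapping is illustrative rather than taken from the DID.

# Notional Python encoding of the cadence in the figure above; the exact
# event-to-release mapping is one plausible reading, not taken from the DID.
CADENCE = {
    "Event 1 (Contract Initial)": {"Release 1": "Estimate"},
    "Event 2 (Contract Interim)": {"Release 1": "In-Process/Actuals", "Release 2": "Estimate"},
    "Event 3 (Contract Interim)": {"Release 1": "In-Process/Actuals", "Release 2": "In-Process/Actuals", "Release 3": "Estimate"},
    "Event 4 (Contract Interim)": {"Release 1": "Final/Actuals", "Release 2": "In-Process/Actuals", "Release 3": "In-Process/Actuals", "Release 4": "Estimate"},
    "Event 5 (Contract Final)": {"Release 2": "Final/Actuals", "Release 3": "Final/Actuals", "Release 4": "Final/Actuals"},
}

# Each release progresses Estimate -> In-Process/Actuals -> Final/Actuals:
for release in ("Release 1", "Release 2", "Release 3", "Release 4"):
    print(release, [m[release] for m in CADENCE.values() if release in m])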
6. COST ASSESSMENT DATA ENTERPRISE
Software Resource Data Reporting (SRDR)
Presented by:
Paul Kopicki, paul.a.kopicki.civ@mail.mil
7. 7
AGENDA
Software Resource Data Report (SRDR)
› Software DID Overview (Development/Maintenance/ERP)
› Software Data DID CRM Review
› Software Data DID Path Forward
9. 9
Software DID Overview
Software DID Overview (Development/Maintenance/ERP)
› Software Resources Data Reporting:
• Data Item Description: DI-MGMT-82035
• Development, Format 1: DD Form 3026-1
• Maintenance Report, Format 2: DD Form 3026-2
• Enterprise Resource Planning (ERP), Format 3: DD Form 3026-3
10. 10
Data Item Description (DID): DI-MGMT-82035
Software DID Overview (Development/Maintenance/ERP)
› DID summarizes the Software (SW) Development Report, Maintenance Report, and ERP SW Development Report
› Provides instructions to support data and frequency requirements specified in CSDR reporting
› Intent of SRDR is to collect objective, measurable data commonly used by industry and DoD cost analysts
› Builds a repository of estimated and actual:
• SW product sizes
• Schedules
• Effort
• Quality
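To make the repository idea concrete, here is a minimal Python sketch of one such record; the field names and the size figure are illustrative assumptions rather than the DID's schema (the hours and schedule echo the faked example later in this deck).

# Minimal sketch of one SRDR repository record pairing the quantities the DID
# collects with a report type. Field names and the size figure are
# illustrative assumptions, not the DID's schema.
from dataclasses import dataclass

@dataclass
class SrdrRecord:
    program: str
    release_id: str
    report_type: str         # "Estimate", "In-Process/Actuals", or "Final/Actuals"
    product_size_esloc: int  # SW product size (illustrative unit)
    schedule_months: float   # release schedule duration
    effort_hours: float      # total labor hours
    defect_count: int        # a simple quality measure

# Values echo the deck's faked example (135,000 hours, roughly 17-month release).
record = SrdrRecord("AH-95 Karuk", "1.0", "Estimate", 250_000, 17.0, 135_000.0, 0)
print(record)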
11. 11
Development, Format 1: DD Form 3026-1
Software DID Overview (Development/Maintenance/ERP)
Consists of two parts
Part 1 – SW Development Technical Data
› SW size, context, technical information
› Release Level and Computer SW Configuration Item (CSCI) Level sections
Part 2 – SW Development Effort Data
› Reports SW efforts associated with each reported release and CSCI
12. Development, Format 1: DD Form 3026-1
Software DID Overview (Development/Maintenance/ERP)
12
Format 1 updates:
› Introduction of Agile Measures
› Instructions captured within DID
Agile Measures Based (SECTION 3.3.2.6.4): Days per Sprint:  Days per Release:
Release Map (SECTION 3.3.2.6.5)
Planned and Achieved Development (by Feature per Epic) (SECTION 3.3.2.6.6):
Epic/Capability Identifier | Feature Identifier | Planned Stories per Feature by Epic | Actual Stories | Planned Story Points per Feature by Epic | Actual Story Points | Planned Hours per Feature by Epic | Actual Feature Hours
ex: AAAA | 1.x.x | 5 | 3 | 15 | 10 | 200 | 180
(NOTE: Insert rows as needed to account for all Features and Epics mapped to this Release.)
Summary Totals for This Release (SECTION 3.3.2.6.7): Quantity Planned and Quantity Actually Developed (sum of all rows with data by Feature by Epic) for each item: Total Epics/Capabilities, Total Features, Total Stories, Total Story Points, Total Feature Hours, Total Sprints
No Additional Significant Updates to Format 1
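Because the summary totals are a mechanical roll-up of the feature-by-epic rows, a minimal Python sketch of that computation follows; field names are hypothetical, planned and actual epic/feature counts are derived from the same rows for simplicity, and Total Sprints (which comes from the Section 3.3.2.6.4 day counts) is omitted.

# Minimal sketch of rolling the feature-by-epic rows (Section 3.3.2.6.6) up
# into the release summary totals (Section 3.3.2.6.7). Field names are
# hypothetical assumptions mirroring the form's column headings.
from dataclasses import dataclass

@dataclass
class FeatureRow:
    epic_id: str          # Epic/Capability Identifier
    feature_id: str       # Feature Identifier
    planned_stories: int
    actual_stories: int
    planned_points: int   # planned story points
    actual_points: int
    planned_hours: float
    actual_hours: float

def release_summary(rows):
    """Return {item: (quantity planned, quantity actually developed)}."""
    n_epics = len({r.epic_id for r in rows})
    return {
        "Total Epics/Capabilities": (n_epics, n_epics),
        "Total Features": (len(rows), len(rows)),
        "Total Stories": (sum(r.planned_stories for r in rows),
                          sum(r.actual_stories for r in rows)),
        "Total Story Points": (sum(r.planned_points for r in rows),
                               sum(r.actual_points for r in rows)),
        "Total Feature Hours": (sum(r.planned_hours for r in rows),
                                sum(r.actual_hours for r in rows)),
    }

# The form's example row: Epic AAAA, Feature 1.x.x with 5 stories planned
# (3 actual), 15 story points planned (10 actual), 200 hours planned (180 actual).
print(release_summary([FeatureRow("AAAA", "1.x.x", 5, 3, 15, 10, 200.0, 180.0)]))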
13. 13
Maintenance Report, Format 2: DD Form 3026-2
Software DID Overview (Development/Maintenance/ERP)
Consists of two parts
Part 1 – SW Maintenance Technical Data
› SW size, context, technical information
› Top Level and Release Level sections
Part 2 – SW Maintenance Effort Data
› Reports the to-date SW maintenance efforts for each in-progress and completed release(s), and total maintenance activities
14. 14
Maintenance Report, Format 2: DD Form 3026-2
Software DID Overview (Development/Maintenance/ERP)
Format 2 updates
› Clarified SW change count definitions
• Confusion in original DID regarding inclusion of Information Assurance Vulnerability Alerts (IAVAs)
• Updated instructions to ensure IAVA changes are mutually exclusive from Priority 1-5 changes, and are not double counted
› Moved SW License Management Hours reporting to Project Management (PM)
• SW License Management is a PM activity
No Additional Significant Updates to Format 2
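Since the mutual-exclusivity rule is essentially a counting rule, a minimal Python sketch may help; the record shape and bucket names are assumptions for illustration, not DID language.

# Minimal sketch of the Format 2 counting rule: a change is tallied either as
# an IAVA change or under its priority (1-5), never both. The record shape
# and bucket names are assumptions for illustration, not DID language.
from collections import Counter

def count_changes(changes):
    """changes: iterable of (is_iava: bool, priority: int) pairs."""
    counts = Counter()
    for is_iava, priority in changes:
        if is_iava:
            counts["IAVA"] += 1                  # IAVA bucket takes precedence,
        else:                                    # so a change is never also
            counts[f"Priority {priority}"] += 1  # counted under Priority 1-5
    return counts

# Example: two IAVA-driven changes plus three ordinary Priority 2 changes
print(count_changes([(True, 1), (True, 2), (False, 2), (False, 2), (False, 2)]))
# Counter({'Priority 2': 3, 'IAVA': 2})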
15. 15
Enterprise Resource Planning (ERP), Format 3: DD Form 3026-3
Software DID Overview (Development/Maintenance/ERP)
SW Development Report specifically for ERP or Defense Business System programs
› Financial
› Personnel
› Inventory
Consists of two parts
Part 1 – System Technical Data
› Provides context, SW product, Development Team, etc.
Part 2 – SW Development Effort Data
› Project resource and schedule information at the Release Level
16. 16
Enterprise Resource Planning (ERP), Format 3: DD Form 3026-3
Software DID Overview (Development/Maintenance/ERP)
Part 1 – Technical Data
ENTERPRISE RESOURCE PLANNING (ERP) SOFTWARE RESOURCES DATA REPORTING, FORMAT 3:
PART 1 Software Development Technical Data (Project) SECTION 3.3
A. Report Context 3.3.1 (Description of Actual Development Organization)
1. Release Name: 3.3.1.1 | Release ID: 3.3.1.2
2. Certified CMM Level (or equivalent): 3.3.1.3
3. Certification Date: 3.3.1.4
4. Lead Evaluator of CMM Certification: 3.3.1.5
5. Affiliation to Development Organization: 3.3.1.6
6. Precedents (list up to ten similar systems completed by the same organization or team): 3.3.1.7
7. Comments on Subsection A responses: 3.3.1.8
B. Product and Team Description 3.3.2
8. Project Functional Description: 3.3.2.1
9. Primary Application Domains for ERP Programs 3.3.2.2 (options: Microcode and Firmware; Signal Processing; Vehicle Payload; Vehicle Control; Other Real-Time Embedded; Command and Control; Communication; System Software; Process Control; Scientific and Simulation; Test, Measurement, and Diagnostic Equipment; Training; Software Tools; Mission Planning; Custom AIS Software; Enterprise Service System; Enterprise Information System)
10. Primary Application Domain comments 3.3.2.3
11. Comments on Subsection B responses: 3.3.2.4
17. 17
Enterprise Resource Planning (ERP), Format 3: DD Form 3026-3
Software DID Overview (Development/Maintenance/ERP)
Part 1 – Technical Data
18. 18
Enterprise Resource Planning (ERP), Format 3: DD Form 3026-3
Software DID Overview (Development/Maintenance/ERP)
Part 2 – SW Development Effort Data
19. 19
Enterprise Resource Planning (ERP), Format 3: DD Form 3026-3
Software DID Overview (Development/Maintenance/ERP)
Part 2 – SW Development Effort Data
20. 20
Enterprise Resource Planning (ERP), Format 3: DD Form 3026-3
Software DID Overview (Development/Maintenance/ERP)
Part 2 – SW Development Effort Data
22. SRDR DID & Forms CRM Review
Software Data DID CRM Review
22
Document | # of Comments | Taking Action | No Action
General SRDR DID/Form | 20 | 7 | 13
Maintenance Form | 11 | 7 | 4
ERP Form | 6 | 2 | 4
Total | 37 | 16 | 21
[Bar chart: the 37 comments by organization (DASA-CE, LM, SSC-SEIT, MDA/CP, NAVAIR), each split between Taking Action and No Action.]
37 Comments
Several rounds of comments through Government and Industry
Last round of DID and Forms sent in April 2017; reviewed/adjudicated comments with SRDR WG through early May
› Open comments coordinated with developers and closed out with POCs end of May
23. SRDR DID & Forms CRM Review
Software Data DID CRM Review
23
[Bar chart repeated from slide 22: 37 comments by organization, split between Taking Action and No Action.]
Major DID Changes since May 2017
› Agile metrics reporting
• From: All agile programs instructed to use ERP instructions and submit agile data using Format 3
• To: Format 1 was updated to include the Agile table as well
› New version of Figure 1 to clarify intent
25. 25
Agile Measures
Software Data DID Backup
For clarity: Cross references from Form 1 to Form 3 were eliminated by moving the Agile Measures section into Form 1.
Comment: For clarity and consistency, update the Traditional Development Form and DID language to include the Agile Measures section that is currently found within the ERP section of the DID.
Proposed Solution: Updated the Forms and DID to duplicate the ERP agile measures language and requirement within the Development form. Agile measures are now reflected on both forms and in both sections of the DID.
26. 26
Submission Event Process Flow Diagram
Software Data DID Backup
Comment: Text contradicts the process flow diagram (Figure 1) found within the DID. Provide clarification to update the text and image so they align with reporting (initial, interim, final) deliverables and expectations.
Proposed Solution: Updated Figure 1 and associated text.
The notional submission event diagram and associated text were confusing; to avoid confusion, Figure 1 was updated (next slide).
27. 27
Submission Event Process Flow Diagram
Software Data DID Backup
[Updated Figure 1: the same submission-event flow shown on slide 5. Each of Releases 1 through 4 is reported as an Estimate at Release Initial, as In-Process/Actuals at Release Interim, and as Final/Actuals at Release Final, across the Contract Initial, Interim (Events 2 through 4), and Final (Event 5) submission events; all release reports are due by contract final.]
28. DID Comments Summary
Software Data DID Backup
28
Of the 37 comments…
› Comments bucketed into 7 categories
› Majority of comments were for additional Clarification or Questions
› Comments were not received differentiating between critical, substantive, or administrative
Comment counts by category:
Clarification……………………….…………. 18
Question………………………………….…… 10
Administrative…………………………..…. 3
Process Diagram…………………………... 3
Duplication Comment…………………… 1
Not Actionable………………….………….. 1
General..…………..……………..…………… 1
Total……………………………………………. 37
29. 29
Development, Format 1: DD Form 3026-1, Initial Report, Common Heading- Faked
Software Data DID Backup
SOFTWARE RESOURCES DATA REPORTING: Metadata (Software Development Report; Security Classification: UNCLASSIFIED)
Major Program Name: AH-95 Karuk | Prime Mission Product: AH-95 Karuk
Phase/Milestone: B | Reporting Organization Type: Prime/Associate Contractor
Performing Organization: Stark Industries, 123 Stark Blvd., Orlando, FL | Division: Stark Aircraft, 123 Stark Blvd., Orlando, FL
Approved Plan Number: A-10-C-C25 | Department: N/A
Type Action: a. Contract No.: WQ5935-12-D-0001; b. Modification No.: 1; c. Solicitation No.: N/A; d. Name: Prototype Development; e. Task Order/Delivery Order/Lot No.: 01
Period of Performance: a. Start Date (YYYYMMDD): 20120501; b. End Date (YYYYMMDD): 20170930
Report Type: Initial | Submission Number: 2 | Resubmission Number: 0
Appropriation: RDT&E and O&M
Report As Of (YYYYMMDD): 20120930 | Date Prepared (YYYYMMDD): 20121130
POC Name (Last, First, Middle Initial): Potts, Pepper | Organization: Stark Aircraft - Software Division | Telephone (Include Area Code): 123-456-7890 | Email Address: ppotts@starkindustries.com
Remarks: All software development and maintenance effort is under WBS element 1.1.2.1 Custom Software Design. There is no software development or maintenance under the other WBS elements. Appropriation is both RDT&E and O&M (same as the associated CSDR).
(The form carries the standard OMB paperwork-burden statement; OMB Control Number 0704-0188, Expiration Date: 8/31/2016.)
30. Development, Format 1: DD Form 3026-1, Initial Report, Release Level - Faked
Software Data DID Backup
SOFTWARE DEVELOPMENT REPORT, FORMAT 1: Part 1 Software Development Technical Data, Release Level
Release ID: 1.0    Release Name: Mission Computer Update 2018 Baseline
Release Start Date: 20120501    Release End Date: 20130930
Total Labor Hours: 135000    Total Staff: 112
Hours Per Staff Month: 154    Computed? No
SOFTWARE PROCESS MATURITY: CMMI Level 3    CERTIFICATION DATE: 20110923
LEAD EVALUATOR: James Rhodes    EVALUATOR AFFILIATION: CMMI Institute
Software-Specific Common Elements (as defined in the CSDR Plan), with the ISO 12207 processes included in each (SW RA = requirements analysis, SW AD = architectural design, SW DD = detailed design, SW Con = construction, SW I = integration, SW QT = qualification testing, SW SP = support processes):
– 1.1.2.1 Mission Computer Update: SW RA, SW AD, SW DD, SW Con, SW I, SW QT, SW SP
– 1.1.2.1 Smart Display: SW RA, SW AD, SW DD, SW Con, SW I, SW QT, SW SP
– 1.2.1 Software Systems Engineering: SW RA, SW AD
– 1.3.1 Software Program Management: SW RA, SW DD, SW Con, SW QT
– 1.4.1.1 Development Test and Evaluation (SW-Specific): N/A; regression testing is included as part of the development activities
– 1.5.1 Training (SW-Specific): N/A; training is part of Program Management
– 1.6.2.1 Engineering Data (SW-Specific): N/A; data is provided in the comments of the Mission Computer Update
Contractor-Defined Software Development Activities: none listed
DEVELOPMENT ORGANIZATION / functional descriptions:
– Mission Computer Update: provides the control and operator interfaces for the AH-95 mission systems and subsystems, and the central computation and control facility that integrates the avionics suite into a cohesive system.
– Smart Display (SD): a software application that provides flight instrument graphics; it is used in cockpit display systems to support flight and mission activities.
– Software Systems Engineering: performed by Stark Industries in Orlando, FL; consists of software requirements and initial design efforts.
– Software Program Management: performed by Stark Industries in Orlando, FL; consists of team management and training utilizing spiral development strategies through the life of the program.
Software Requirements Count Definition: the Software Requirements count is defined by counting the number of "shall" statements in the latest Software Requirements Specification (SRS). This includes functional, non-functional, security, safety, privacy, and cybersecurity requirements.
External Interface Requirements Count Definition: the External Interface Requirements count is defined by counting the number of unique interface statements in each Subsystem Interface Description Document (IDD).
Product Quality Reporting Definition: (blank)
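Reading the faked header figures as 135,000 total labor hours at a stated 154 hours per staff month implies roughly 135000 / 154 ≈ 877 staff-months, or an average of about 52 staff over the 17-month release (20120501 to 20130930). The "Computed? No" flag appears to indicate the 154 figure was stated by the contractor rather than derived from the reported hours.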
31. Development, Format 1: DD Form 3026-1, Initial Report, Release Level - Faked
Software Data DID Backup
SECURITY CLASSIFICATION: UNCLASSIFIED
SYSTEM DESCRIPTION: The AH-95 Karuk is an Attack Helicopter (AH) with the capability to perform the primary missions of Infantry Support and Close Air Combat. This software is an update to the Mission Computer for the helicopter.
PRECEDENTS (List at least three similar systems.):
– Flight Armor Software Mark 3
– Automobile Flight Software Mark 21
– Personal Helicopter Mission Computer
Aerospace UCC Version: N/A
Alternate Code Counter Name: Code Counter Plus    Alt Code Counter Version: 2.0
Alternate Code Counter Description: Third-Party Code Counter
Alternate Code Counter Comparison to UCC: Out of a comparison of 381 source files, the UCC tool counted 1.3% more logical lines of source code than Code Counter Plus.
RELEASE-LEVEL COMMENTS: The precedents listed above are similar projects in that they all share the engineers and development team. They also use the same development and lab environments and software tools.
DD Form 3026-1, MAY 2016
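For scale, the reported 1.3% delta means a file set counted at, say, 100,000 logical lines by Code Counter Plus would come out near 101,300 lines under UCC (hypothetical totals; the slide reports only the percentage and the 381-file sample).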
32. Development, Format 1: DD Form 3026-1, Initial Report, CSCI Level - Faked
Software Data DID Backup
Product and Development Description
Functional Description: The Mission Computer Update provides the control and operator interfaces for the AH-95 mission systems and subsystems. It provides the central computation and control facility that integrates the avionics suite into a cohesive system.
Software Development Characterization
Software State of Development (Check one only): Prototype / Production-Ready / [X] Mix
Operating Environment(s) (Check all that apply): Surface Fixed / Surface Mobile / Surface Portable / [X] Air Vehicle / Sea Systems / Space Systems / Ordnance Systems / Missile Systems / Other (If Other, provide explanation); [X] Manned / Unmanned
Primary Application Domain (Check one only): Microcode and Firmware / Signal Processing / Vehicle Payload / Vehicle Control / Other Real-Time Embedded / Command and Control / Communication / Process Control / Scientific and Simulation / Test, Measurement, and Diagnostic Equipment / Training / [X] System Software / Mission Planning / Custom AIS Software / Enterprise Service System / Enterprise Information System / Software Tools
Application Domain Comments: N/A - The program is currently in the requirements development phase; maintenance will begin immediately upon deployment.
This product is a new development: No; this is a modification of an existing Mission Computer.
Development Process: Spiral
Software Development Method: The methodology used for software development is the standard Spiral development process that has been used previously. The Software Development team will receive software requirements ...
33. Development, Format 1: DD Form 3026-1, Initial Report, CSCI Level - Faked
Software Data DID Backup
Software Reuse (Name / Description):
– Mission Computer: The previous Mission Computer will be the baseline for the Upgrade.
COTS / GOTS / Open-Source Applications (Name / Glue Code Estimated / Configuration Effort Estimated): none listed
COTS/GOTS Comments: N/A
STAFFING (Initial and Interim Reports only): Peak Staff: 167    Peak Staff Date: 20130101
Product and Development Description Comments: This is an Upgrade to the existing Mission Computer, so while there will be new development, it is not a new program.
Product Size Reporting
Software Requirements Count: Total: 23854; Inherited: 13502; Added/New: 4999; Modified: 0; Deleted: 0; Deferred: 1535; Requirements Volatility: (blank); Certification and Accreditation Rqts: (blank); Security Requirements: 3250; Safety Requirements: 568; Privacy Requirements: 0
Software Requirements Comments: No requirements are implemented at this time, as this is the Initial Report and no work has yet been done.
External Interface Requirements Count: Total: 1585; Inherited: 950; Added/New: 270; Modified: (blank); Deleted: (blank); Deferred: 335; Requirements Volatility: (blank); Certification and Accreditation Rqts: (blank); Security Requirements: 30; Safety Requirements: (blank); Privacy Requirements: (blank)
External Interface Requirements Comments: No requirements are implemented at this time, as this is the Initial Report and no work has yet been done.
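The Requirements Volatility fields above are blank in this faked initial report. A common convention (an assumption here; the deck does not reproduce the DID's exact formula) measures volatility as the churned share of the total count, which for the software requirements figures above would give (Added 4999 + Modified 0 + Deleted 0) / Total 23854 ≈ 21%.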
34. Development, Format 1: DD Form 3026-1, Initial Report, CSCI Level - Faked
Software Data DID Backup
SLOC-Based Software Size
Columns are reported for the Primary Language (L1), Secondary Language (L2), and Other, each broken out in total, for the PRIME ONLY, and for ALL SUBCONTRACTORS; reuse rows also carry the adaptation factors DM%, CM%, IM%, and AAF%. Rows distinguish delivered code developed new, code reused from an external source (i.e., not carryover from a previous release), carryover code (i.e., reused from a previous release), Government-furnished code, and deleted code, each split by with/without modifications and auto-generated/human-generated where applicable.
ACTUAL COUNTS TO DATE (Interim and Final Reports only):
– Reused code (the previous Mission Computer baseline): L1 766321, L2 105230, Other 0 (all prime; subcontractors 0); DM% / CM% / IM% / AAF% reported as 0%
– All other rows (developed new, Government-furnished, deleted): 0
– TOTAL DELIVERED CODE: L1 766321, L2 105230, Other 0 (prime 766321 / 105230 / 0; subcontractors 0 / 0 / 0)
ESTIMATES AT COMPLETION (Initial and Interim Reports only):
– Delivered code developed new: L1 135268, L2 1795, Other 0 (all prime; subcontractors 0)
– Reused code (the previous Mission Computer baseline): L1 766321, L2 105230, Other 0 (all prime); DM% / CM% / IM% / AAF% reported as 0%
– All other rows (Government-furnished, deleted): 0
– TOTAL DELIVERED CODE: L1 901589, L2 107025, Other 0 (prime 901589 / 107025 / 0; subcontractors 0 / 0 / 0)
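The reuse rows above carry the adaptation factors DM%, CM%, IM%, and AAF% (question 1.5.2.10 later in this deck calls these percent redesign, recode, and reintegration). As a hedged sketch of how analysts commonly turn those percentages into equivalent size, assuming the classic COCOMO-style 0.4/0.3/0.3 weighting (the deck itself does not state the weights, and the function names below are hypothetical):

def aaf(dm: float, cm: float, im: float) -> float:
    """Adaptation Adjustment Factor from fractions of design/code/integration modified."""
    return 0.4 * dm + 0.3 * cm + 0.3 * im

def equivalent_sloc(adapted_sloc: int, dm: float, cm: float, im: float) -> float:
    """Equivalent-new size of reused or carryover code."""
    return adapted_sloc * aaf(dm, cm, im)

# Illustrative only: the faked report lists 0% for every factor. With, say,
# 10% redesign, 20% recode, and 30% reintegration on the 766,321-line L1 reuse:
print(equivalent_sloc(766_321, dm=0.10, cm=0.20, im=0.30))  # 0.19 * 766321 ≈ 145,601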
35. Maintenance Report, Format 2: DD Form 3026-2, Initial Report Common Heading - Faked
Software Data DID Backup
SOFTWARE RESOURCES DATA REPORTING: Metadata
SECURITY CLASSIFICATION: UNCLASSIFIED
MAJOR PROGRAM NAME: AH-95 Karuk    PRIME MISSION PRODUCT: AH-95 Karuk
PHASE/MILESTONE: Pre-A / A / [X] B / C-LRIP / C-FRP / O&S
REPORTING ORGANIZATION TYPE: [X] PRIME/ASSOCIATE CONTRACTOR / DIRECT-REPORTING SUBCONTRACTOR / GOVERNMENT
PERFORMING ORGANIZATION: a. NAME: Stark Industries; b. ADDRESS: 123 Stark Blvd., Orlando, FL
DIVISION: a. NAME: Stark Aircraft; b. ADDRESS: 123 Stark Blvd., Orlando, FL
APPROVED PLAN NUMBER: A-10-C-C25    CUSTOMER: N/A
REPORT TYPE: [X] INITIAL / INTERIM / FINAL    SUBMISSION NUMBER: 2    RESUBMISSION NUMBER: 0
APPROPRIATION: [X] RDT&E / PROCUREMENT / [X] O&M
REPORT AS OF (YYYYMMDD): 20120930    DATE PREPARED (YYYYMMDD): 20121130
TYPE ACTION: a. CONTRACT NO.: WQ5935-12-D-0001; b. MODIFICATION NO.: 1; c. SOLICITATION NO.: N/A; d. NAME: Prototype Development; e. TASK ORDER/DELIVERY ORDER/LOT NO.: 01
PERIOD OF PERFORMANCE: a. START DATE (YYYYMMDD): 20120501; b. END DATE (YYYYMMDD): 20170930
POC NAME (Last, First, Middle Initial): Potts, Pepper    DEPARTMENT: Stark Aircraft - Software Division    TELEPHONE (Include Area Code): 123-456-7890    EMAIL ADDRESS: ppotts@starkindustries.com
REMARKS: All software development and maintenance effort is under WBS element 1.1.2.1 Custom Software Design. There is no software development or maintenance under the other WBS elements. Appropriation is both RDT&E and O&M (same as the associated CSDR).
OMB Control Number 0704-0188    Expiration Date: 8/31/2016
36. Maintenance Report, Format 2: DD Form 3026-2, Initial Report Part 1 - Faked
Software Data DID Backup
SOFTWARE MAINTENANCE REPORT, FORMAT 2: Top Level PART 1 Software Maintenance Technical Data
System Description: The AH-95 Karuk is an Attack Helicopter (AH) with the capability to perform the primary missions of Infantry Support and Close Air Combat. This software is an update to the Mission Computer for the helicopter.
PRECEDENTS (List at least three similar systems by the same organization or team.):
– Flight Armor Software Mark 3
– Automobile Flight Software Mark 21
– Personal Helicopter Mission Computer
No. of Unique Baselines Maintained: 1 (Mission Computer Update 2018 Baseline)
No. of Total Hardware Platforms This Software Operates On: 4
– The Mission Computer (MC) Block C and D hardware
– The Flight Management Computer (FMC) Block B and C hardware
Operation Tempo (check one): [X] Extensive / Regular / Event Driven / Other (If Other, provide explanation: N/A)
MAINTENANCE ORGANIZATION
SOFTWARE PROCESS MATURITY: CMMI Maturity Level 3    CERTIFICATION DATE: 20110923
LEAD EVALUATOR: James Rhodes    EVALUATOR AFFILIATION: CMMI Institute
LEAD GOVERNMENT ORGANIZATION: GOVERNMENT ORGANIZATION NAME: Army Aviation Command
LOCATION (Enter the lead government maintenance location): Suite 235, BLDG H-21, Huntsville, AL
Software Requirements Count Definition: The Software Requirements count is defined by counting the number of "shall" statements in the latest Software Requirements Specification (SRS). This includes functional, non-functional, security, safety, privacy, and cybersecurity requirements.
External Interface Requirements Count Definition: The External Interface Requirements count is defined by counting the number of unique interface statements in each Subsystem Interface Description Document (IDD). This includes functional, non-functional, security, safety, privacy, and cybersecurity requirements.
37. COST ASSESSMENT DATA ENTERPRISE
SURF Process Summary & Initial Findings:
A Deeper Focus on Software Data Quality
This document was generated as a result of the AFCAA-led Software Resource Data Report Working Group (SRDRWG), a joint effort among all DoD service cost agencies. The following guidance describes SRDR data verification and validation best practices as documented by NCCA, NAVAIR 4.2, AFCAA, ODASA-CE, MDA, and others.
Presented by:
Ranae Woods, AFCAA, ranae.p.woods.civ@mail.mil
Dan Strickland, MDA, daniel.strickland@mda.mil
Nicholas Lanham, NCCA, nicholas.lanham@navy.mil
Marc Russo, NCCA, marc.russo1@navy.mil
Haset Gebre-Mariam, NCCA, haset.gebremariam@navy.mil
38. Table of Contents
SURF Process Summary & Initial Findings
› Purpose
› SURF Need Statement
› SURF Purpose
› SURF Creation and Process Initiation
› SURF Team Structure
› SURF Verification & Validation (V&V) Guide
› SRDR V&V Process
› SRDR Database
› SRDR Data Quality Review
› SURF Status and Metrics
› Summary
39. Presentation Purpose
SURF Process Summary & Initial Findings
› To familiarize the audience with recent Software Resource Data Report (SRDR) Working Group (WG) efforts to update existing SRDR DID language and implement data quality improvements
› To clarify how these SRDRWG efforts led to the development of the SRDR Unified Review Function (SURF) team
› To highlight:
− the SURF mission
− the positive impact of the SURF team and Verification and Validation (V&V) guide on SRDR data quality
40. SURF Need Statement
SURF Process Summary & Initial Findings
Why do these reports need to be reviewed?
› Reduces inaccurate use of historical software data
– Aligns with OSD CAPE initiative(s) to improve data quality
› Helps correct quality concerns prior to final SRDR acceptance
› Allows a central group of software V&V SMEs to tag SRDR data
› SRDR submissions are used by all DoD cost agencies when developing or assessing cost estimates
› Quality data underpins quality cost and schedule estimates
"BBP Principle 2: Data should drive policy. Outside my door a sign is posted that reads, 'In God We Trust; All Others Must Bring Data.' The quote is attributed to W. Edwards Deming."
- Mr. Frank Kendall, AT&L Magazine article, January-February 2016
41. Question: What services helped develop the questions included within the latest SRDR V&V guide?
Answer: All services participating in the SRDR WG provided feedback, comments, and reviews over a year-long SRDRWG effort focused on establishing higher-quality review efforts coupled with an ongoing SRDR DID update.
SURF Purpose
SURF Process Summary & Initial Findings
Purpose
› To supplement the Defense Cost and Resource Center (DCARC) quality review for SRDR submissions
› To develop a consistent, service-wide set of quality questions for all DoD cost community members to reference
› To provide a consistent, structured list of questions, focus areas, and possible solutions to cost community members tasked with inspecting SRDR data submissions for completeness, consistency, quality, and usability (e.g., the SRDR V&V Guide)
Why?
› SURF represents an effort to establish a consistent guide for any organization assessing the realism, quality, and usability of SRDR data submissions
› Quality data underpins quality cost and schedule estimates
42. Question: How was the SURF team created, and is it linked to the SRDRWG?
Answer: Yes. The SRDR Unified Review Function (SURF) team was organized as part of the larger SRDRWG initiative during 2015.
How Was SURF Created?
SURF Process Summary & Initial Findings
Recommendations and their benefits:
1. Revised SRDR Development Data Item Description (DID). Benefit: Reduces inconsistency, lack of visibility, complexity, and subjectivity in reporting.
2. New SRDR Maintenance Data Item Description (DID). Benefit: Aligned with the development DID, but with the unique data/metrics available and desired for the maintenance phase.
3. Joint Validation & Verification (V&V) Guide, Team, and Process. Benefit: Higher quality and less duplication (one central team versus many distributed reviews); one joint team and guide gives early, consistent feedback to contractors.
4. Software Database Initial Design and Implementation Process. Benefit: Avoids duplication and variation (one central database versus many distributed); based on surveyed best practices and user expectations.
43. Question: How do members get involved with SURF? Why are there "primary" and "secondary" members?
Answer 1: The SURF team was established by Government SRDRWG members who were recommended or volunteered by each DoD service.
Answer 2: Primary members are included on CSDR S-R IPT email notifications for their specific service. Secondary members are contacted during periods of increased review demands, if necessary.
SURF Team Structure
SURF Process Summary & Initial Findings
› Team is comprised of one primary member per service, along with support from secondary team members (Government only)
› As submissions are received, SRDR review efforts are distributed among SURF team members to balance workload
› SRDR submissions are received from DCARC
› SURF Team Coordinators (STC): Marc Russo & Haset Gebre-Mariam
› SURF Advisor & Process Owner (SAPO): Nick Lanham
› Current SURF structure (primary and secondary members by organization):
– DoD: William Raines
– Navy: Corrinne Wallshein, Wilson Rosa, Stephen Palmer, Philip Draheim, Sarah Lloyd
– Marine Corps: Noel Bishop, John Bryant
– Air Force: Ron Cipressi, Janet Wentworth, Chinson Yew, Eric Sommer
– Army: Jim Judy, Jenna Meyers, James Doswell, Michael Smith, Michael Duarte
– SPAWAR: Jeremiah Hayden, Min-Jung Gantt
– MDA: Dan Strickland
44. Question: Did a standardized, joint-service, software-specific quality review guide exist prior to the SURF V&V guide? Who contributed to the development of this document?
Answer 1: No. Services implemented very inconsistent SRDR review methodologies (if conducted at all) prior to DCARC acceptance.
Answer 2: The SRDR V&V guide was developed by the SURF team and has been reviewed by numerous SRDRWG, OSD CAPE, and other cost community team members. Feedback from other services has generated significant improvements from the initial draft.
SRDR V&V Guide
SURF Process Summary & Initial Findings
The guide represents the first-ever joint effort among DoD service cost agencies
› OSD public release approved 5 April 2016
› Kickoff email distributed on 1 May 2017 to update the guide with the latest DID requirements
› Files can be downloaded using the following link: http://cade.osd.mil/roles/reviewers#surf
Enables analysts to consistently isolate software cost relationships and trends based on quality SRDR data
› Now includes a quick-reference MS Excel question checklist organized by SRDR DID section
Two main purposes:
› SRDR V&V training guide (V&V questions)
› Focus areas used to determine SRDR quality tags
45. V&V Questions and Examples Developed and Organized by Individual SRDR Reporting Variable
SRDR V&V Guide Table of Contents (TOC)
SURF Process Summary & Initial Findings
1.0 Review of an SRDR submitted to DCARC
1.1 Reporting Event
1.2 Demographic Information
1.3 Software Char. and Dev. Process
1.3.1 Super Domain and Application Domains
1.3.2 Operating Environment (OE) Designation
1.3.3 Development Process
1.4 Personnel
1.5 Sizing and Language
1.5.1 Requirements
1.5.2 Source Lines of Code (SLOC)
1.5.3 Non-SLOC Based Software Sizing
1.5.4 Product Quality Reporting
1.6 Effort
1.7 Schedule
1.8 Estimate at Completion (EAC) Values
2.0 Quality Tagging
3.0 Solutions for Common Findings
3.1 Allocation
3.2 Combining
3.3 Early Acquisition Phase Combining
4.0 Pairing Data
5.0 Possible Automation
Appendix A – SD and AD Categories
Appendix B – Productivity Quality Tags
Appendix C – Schedule Quality Tags
Appendix D – SRDR Scorecard Process
46. V&V Guide Includes Specific Questions for SURF Members to Confirm Prior to Accepting the Report
SRDR V&V Guide Example Questions
SURF Process Summary & Initial Findings
Section 1.6: Effort
When assessing Effort, the V&V priority is determining completeness. Determining completeness is not always easy, because:
– The contractor may collect and report their actual performance using categories that differ from the ISO 12207 standard
– The contractor may report all of their effort within the "Other" category
Common questions to ask when looking at the effort data (a small automated sketch follows this list):
– Was effort data reported for each CSCI or WBS?
– Was effort data reported as estimated or actual results? If the submission includes estimated values and actual results, does the report include a clear and documented split between actual results and estimated values?
– Is the effort data reported in hours?
– Is effort data broken out by activity?
– What activities are covered in the effort data? Is there an explanation of missing activities included within the supporting SRDR data dictionary? ...
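Several of these completeness checks lend themselves to mechanical pre-screening. A minimal sketch, assuming a hypothetical record layout (SRDR submissions are not actually distributed in this structure; the field names below are illustrative only):

ISO_12207_ACTIVITIES = {
    "requirements analysis", "architectural design", "detailed design",
    "construction", "integration", "qualification testing", "support processes",
}

def effort_findings(record: dict) -> list[str]:
    """Return V&V-style findings for one CSCI/WBS effort record."""
    findings = []
    if record.get("effort_unit") != "hours":
        findings.append("Effort is not reported in hours.")
    by_activity = record.get("hours_by_activity", {})
    if not by_activity:
        findings.append("Effort is not broken out by activity.")
    missing = ISO_12207_ACTIVITIES - set(by_activity)
    if missing:
        findings.append(f"Activities not covered: {sorted(missing)}")
    if set(by_activity) == {"other"}:
        findings.append("All effort reported under 'Other'.")
    return findings

# Example: a record reporting hours for construction only would be flagged for
# the six remaining ISO 12207 activities.
print(effort_findings({"effort_unit": "hours",
                       "hours_by_activity": {"construction": 52_000}}))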
47. SURF Team V&V Process
SURF Process Summary & Initial Findings
Monthly Recurring SURF and DCARC Communications
Purpose of the SURF process: to provide completed V&V checklists to DCARC within two weeks of request.
1. DCARC, Step 1 (first week of every month): SRDR status list sent to the SURF Team Coordinator.
2. SURF, Step 1 (+2 days): SRDR status list distributed to primary and secondary POCs.
3. SURF, Step 2 (+13 days): Conduct V&V reviews by populating the MS Excel question template.
4. SURF, Step 3 (NLT +14 days): Provide completed V&V question templates back to DCARC.
5. DCARC, Step 2 (timing varies by contractor): Combine SURF and DCARC comments; coordinate comment resolution with the submitting organization.
6. Database, Step 1 (timing varies by number of submissions): Adjudicated SRDR sent to NAVAIR 4.2 for data entry into the DACIMS dataset. Note: Future database(s) will be hosted via CADE.
Important note: CADE is developing relational databases for the new DID formats. Over time, data entry will be automated. Until then, manual data entry continues by the NAVAIR 4.2 team for the development format only. Please refer to the V&V guide for additional automation details and future data quality initiatives.
48. SRDR Database Location
SURF Process Summary & Initial Findings
› Log in to CADE and click on "CADE Portal Login": http://cade.osd.mil/
› Select "My CADE" from the menu, and click on "CADE Library"
› Use the Advanced Search and filter on "Container Types" under "Software Data"
Question: Where does SRDR data go after SURF review?
Answer: Once an SRDR record has been accepted, the data is entered into the SRDR dataset posted to the CADE > DACIMS web portal.
Question: Who enters the data into the dataset?
Answer: Currently, members from NAVAIR 4.2 enter data into the SRDR dataset (10+ years of experience). Future data entry is planned to be automated using .XML schemas linked to the latest DID formats.
49. SRDR Data Quality Review
SURF Process Summary & Initial Findings
› The SRDR database is available to Government analysts with access to the CADE portal
– This dataset is the authoritative source for SRDR data (10+ years of uploads)
› Data is not automatically considered "Good" for analysis
› The SURF team may recommend DCARC not accept a submission due to data quality concerns outlined in the V&V guide. Examples include:
– Roll-up of lower-level data (to avoid double-counting effort)
– Significant missing content in hours, productivity, and/or SLOC data
– An interim build actual that is not stand-alone
– Inconsistencies or oddities in the submission
– Additional reasons discussed in the V&V guide
Dataset Posted to OSD CAPE DACIMS Web Portal:
Data Segments                                    Dec-07  Dec-08  Oct-10  Oct-11  Aug-13  Apr-14  Apr-15  Dec-16
CSCI Records                                       688     964    1473    1890    2546    2624    2853    3487
Completed program or build                          88     191     412     545     790     911    1074    1326
Actuals considered for analysis (e.g., "Good")       0     119     206     279     400     403     682     829
Paired Initial and Final Records                     0       0      78     142     212     212     212     240
50. SURF Team Combined With V&V Guide and DCARC Have Significantly Improved Software Data Quality
SRDR Data Quality Review
SURF Process Summary & Initial Findings
› Prior to the SURF process, only 15% of SRDR data was considered "Good"
› After one-plus years of SURF reviews, ~24% of data has been tagged as "Good"
› The Army team is currently working to review historical data; once completed, the "Good" percentage will likely increase to ~31%
[Chart: 2011-2017 Trend Analysis of total records (e.g., CSCIs) versus actuals considered for analysis (e.g., "Good"). Total records: 1890 (Oct-11), 2546 (Aug-13), 2624 (Apr-14), 2853 (Apr-15), 3487 (Dec-16); "Good" records: 279, 400, 403, 682, and 829 at the same dates.]
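These percentages tie out against the table on the previous slide: the pre-SURF ~15% corresponds to the Apr-14 point (403 "Good" of 2624 records, or 403 / 2624 ≈ 15.4%), and the current ~24% corresponds to the Dec-16 point (829 / 3487 ≈ 23.8%).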
51. SURF Team Status
SURF Process Summary & Initial Findings
Key facts (as of June 2017):
› Recurring SURF team meetings kicked off on 23 June 2015
– Group includes ~19 Government team members from across the DoD
– Has received very positive feedback from the DoD cost estimation community, DCARC analyst(s), and even program office communities since inception
– Completed the initial version of the SRDR V&V guide (March 2015)
– Initiated initial SURF team training using the draft V&V guide (June 2015)
– Completed development of the SURF team charter (July 2015)
– During the training period, SURF generated 483 V&V comments provided to DCARC (June 2015 to March 2016)
– Completed the official SURF kickoff with DCARC and published the V&V guide (March 2016)
– After the training period, the formal SURF process generated 889 V&V comments (March 2016 to December 2016)
– Concluding CY16, the SURF team had generated 1,372 V&V comments from 92 SRDR submissions (1,282 during CY16)
› Current status
– CY17 represents the first full year of official SURF reviews using the published V&V guide
– Team recently kicked off an effort to update existing V&V questions to align with the latest SRDR DID
– Co-chaired the Collaborative Cost Research Group (CCRG) focused on increasing "Good" SRDR records (March 2017)
– Continued process improvement efforts to maintain an efficient and effective process
– Working with DCARC to develop a SURF user interface within CADE
V&V Comments Have Significantly Improved SRDR Data Quality
52. SURF Comments by V&V Category
SURF Process Summary & Initial Findings
All Reviews (Mar – Dec 2016)
53. SURF Team Findings by V&V Category
SURF Process Summary & Initial Findings
› V&V comments are generated when SURF members answer a question with “No”
› Common trends for “No” responses:
– Reports not including all types of requirement counts (e.g., new, modified, inherited, deleted, cybersecurity, etc.)
– Reports not including COTS/GOTS “glue code” Software Lines of Code (SLOC) totals
– Reports not including SLOC counts generated with the USC Unified Code Count (UCC) tool
– Reports not including software defect counts
– Reports not including a subset of required metadata (e.g., TD, EMD, Prototype, Service, Milestone, Contract Type, etc.)
› Common trends for “Yes” responses:
– Reports include a subset of metadata (e.g., contractor, contract number, report type, report POC, program name, etc.)
– Reports typically have SLOC counts broken out by new, modified, reuse, auto, and deleted
– Reports typically include primary language type designation
– Reports typically include “Effort” broken out by activity
› Common trends for “N/A” responses:
– Reports typically do not include “Forecast At Completion (FAC)” values
– Reports typically do not include non-SLOC sizing metrics (Function Points, RICE-FW, etc.)
– SURF analyst typically does not have access to corresponding CSDR plan (Working with CADE to develop SURF portal)
All Reviews (Mar – Dec 2016)
54. SURF V&V Guide
SURF Process Summary & Initial Findings
Process for update:
1. SURF members are currently updating or creating draft question lists to account for the new DIDs for Development, Maintenance, and ERP
› Updates to the development question lists include improvements from lessons learned over the previous year
2. Draft question lists will then be sent to a larger group of SRDR-focused team members to ensure the question lists are reasonable and address data quality concerns
› It is important to keep question lists to a reasonable size for continued SURF success
3. The V&V guide and question templates will be updated to incorporate the new questions as well as other lessons learned
4. The updated V&V guide will go to the larger SRDR working group and senior management for final comments/feedback
5. The updated V&V guide will be sent to OSD for final PAO approval and posting to CADE
55. SURF Summary
SURF Process Summary & Initial Findings
SURF is focused on improving data quality and helping support a robust Government review process.
We would like to thank all of the DoD and non-DoD individuals who have commented, participated, and provided feedback throughout the past few years.
Please feel free to use the contact information below if you would like more information regarding SURF, the SRDR V&V Guide, or the checklist.
Marc Russo
Naval Center for Cost Analysis (NCCA)
NIPR: marc.russo1@navy.mil
Nicholas Lanham
Naval Center for Cost Analysis (NCCA)
NIPR: nicholas.lanham@navy.mil
Ron Cipressi
Air Force Cost Analysis Agency
NIPR: ronald.p.cipressi.civ@mail.mil
Dan Strickland
Missile Defense Agency (MDA)
NIPR: daniel.strickland@mda.mil
57. V&V Questions Most Frequently Answered with a "No" Response
SURF Process Summary & Initial Findings
All Reviews (Mar - Dec 2016); each question from the V&V guide template is shown with its response counts (No / Yes / N/A / No Response):
– 1.5.1.1 (31 / 0 / 1 / 0): Does the submission clearly illustrate the number of Inherited, Added, Modified, Deleted, and Deferred requirements for both internal and external categories?
– 1.5.2.16 (28 / 0 / 4 / 0): If COTS or GOTS items have been included within the submission, has the submitting organization provided the SLOC total required to integrate the identified COTS/GOTS product (i.e., glue code)?
– 1.5.1.2 (26 / 1 / 5 / 0): Has the submitting organization separated the provided requirements by Security, Safety, and Privacy or Cybersecurity?
– 1.5.2.4 (25 / 6 / 1 / 0): Did the submitter use the Aerospace-approved version of the University of Southern California (USC) Center for Systems and Software Engineering (CSSE) Unified Code Count (UCC) tool to count the provided SLOC totals? If not, was the name of the code counting tool used by the submitting organization included within the supporting comments section and/or data dictionary?
– 1.2.5 (24 / 8 / 0 / 0): Has the system description been included within the submission?
– 1.2.2 (20 / 2 / 10 / 0): Has the Major Defense Acquisition Program (MDAP) or Major Automated Information System (MAIS) designation been listed?
– 1.3.2.3 (19 / 11 / 2 / 0): Has the state of development been identified (for example: Prototype, Production-Ready, or a mix of the two)?
– 1.5.4.2 (18 / 9 / 5 / 0): Has the priority level for each category of software defects been provided?
– 1.1.9 (17 / 8 / 7 / 0): Is it clear if the information represents a Technology Demonstration (TD) or Engineering and Manufacturing Development (EMD) phase if the program is in that stage of development?
– 1.5.2.13 (17 / 14 / 1 / 0): Has the contractor or submitting organization provided the name of the software products that have been referenced to generate the provided reuse SLOC totals?
– 1.5.4.1 (17 / 10 / 5 / 0): Has the submitting organization provided a breakout of the number of software defects Discovered, Removed, and Deferred?
– 1.7.2 (16 / 16 / 0 / 0): Has the submitting organization clearly stated if the provided schedule data was reported as estimated, allocated, or actual results?
– 1.2.16 (15 / 14 / 3 / 0): Is the specific U.S. military service branch or customer identified (for example: Navy, Air Force, Army, prime contractor, etc.)?
– 1.3.1.1 (15 / 17 / 0 / 0): Does the SRDR submission, comments section, or data dictionary include a clear system-level functional description and software operational overview?
– 1.2.6 (14 / 18 / 0 / 0): Have the program phase and/or milestone been included within the report (for example: Pre-A, A, B, C-LRIP, C-FRP, O&S, etc.)?
– 1.3.2.1 (14 / 17 / 1 / 0): Does the SRDR data dictionary include a clear system-level functional description and software operational overview?
– 1.2.19 (13 / 19 / 0 / 0): Has the contract Period of Performance (PoP) been identified?
– 1.2.23 (13 / 18 / 1 / 0): Does the submission include adequate detail within the comments section to support analysts who may reference the submission sometime in the future (for example: provide context for analyzing the provided data, such as any unusual circumstances that may have caused the data to diverge from historical norms)?
– 1.3.3.3 (13 / 16 / 2 / 1): If an upgrade, does the SW sizing reflect significant reuse or modification SLOC totals when compared to new code?
– 1.4.5 (13 / 17 / 1 / 1): Does the peak headcount make sense against the reported schedule and hours? A simple test is to divide the total reported hours by the schedule months and then convert the resulting average monthly hours into an average Full Time Equivalent (FTE) count using the reported hours in a man-month. The peak headcount must be higher than this FTE monthly average; at the same time, the peak headcount should not be wildly disproportionate to that average either. (See the sketch after this list.)
– 1.5.2.10 (13 / 4 / 15 / 0): Were code adaptation factors reported (percent redesign, recode, reintegration)? Do they appear to be unique for each CSCI, or are they standard rules of thumb?
– 1.2.17 (12 / 20 / 0 / 0): Has the specific contract type been identified? For contracts, task orders, or delivery orders with multiple CLINs of varying contract types, the contract type reported should be the one associated with the plurality of cost.
– 1.2.22 (12 / 20 / 0 / 0): Has the funding appropriation been identified (for example: Research, Development, Test and Evaluation (RDT&E), Procurement, Operation and Maintenance (O&M), Foreign Military Sales (FMS), etc.)?
– 1.2.4 (12 / 12 / 8 / 0): Has the Defense materiel item category been provided in accordance with MIL-STD-881C guidance (for example: aircraft, radar, ship, Unmanned Aerial Vehicle (UAV) system)?
– 1.3.3.5 (12 / 20 / 0 / 0): Has the development method also been identified (for example: Structured Analysis, Object-Oriented, Vienna Development, etc.)?
– 1.6.2 (12 / 18 / 2 / 0): Was effort data reported as estimated or actual results? If the submission includes estimated values and actual results, does the report include a clear and documented split between actual results and estimated values?
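Question 1.4.5 above spells out an arithmetic sanity check. A minimal sketch of that check, plugging in figures drawn from different slides of the faked Format 1 example (135,000 release-level hours, a 17-month release, 154 hours per staff month, and the CSCI-level peak staff of 167), so the result is illustrative only:

def average_fte(total_hours: float, schedule_months: float, hours_per_staff_month: float) -> float:
    """Average monthly full-time-equivalent staffing implied by hours and schedule."""
    return (total_hours / schedule_months) / hours_per_staff_month

avg = average_fte(135_000, 17, 154)  # 135000 / 17 ≈ 7941 hours/month; / 154 ≈ 51.6 FTEs
print(round(avg, 1))                 # 51.6
peak_staff = 167
print(peak_staff > avg)              # True: the peak exceeds the monthly average, as required,
                                     # though a reviewer might still flag a peak 3x the average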
58. V&V Questions Most Frequently Answered with a "Yes" Response
SURF Process Summary & Initial Findings
All Reviews (Mar - Dec 2016); each question from the V&V guide template is shown with its response counts (No / Yes / N/A / No Response):
– 1.2.12 (0 / 32 / 0 / 0): Is the contract number reported?
– 1.2.20 (0 / 32 / 0 / 0): Has the report type been identified (for example: Initial, Interim, or Final)?
– 1.2.7 (0 / 32 / 0 / 0): Has the contractor or organization that performed the work been identified?
– 1.2.21 (3 / 29 / 0 / 0): Is there a single submission Point of Contact (POC) and supporting contact information included within the report?
– 1.7.1 (3 / 29 / 0 / 0): Has schedule data been included in the submission?
– 1.7.4 (3 / 29 / 0 / 0): Is schedule data broken out by SRDR activity?
– 1.1.2 (4 / 28 / 0 / 0): Does the report reference the CSDR Plan?
– 1.2.1 (4 / 28 / 0 / 0): Has the program name been identified?
– 1.5.2.1 (4 / 28 / 0 / 0): Was the primary programming language reported?
– 1.5.2.6 (4 / 28 / 0 / 0): Are the SLOC counts for different types of code (e.g., new, modified, reused, auto-generated, Government-furnished, and deleted) separated, or are they mixed together?
– 1.6.3 (4 / 28 / 0 / 0): Is the effort data reported in hours?
– 1.6.4 (4 / 28 / 0 / 0): Is effort data broken out by activity?
– 1.6.5 (4 / 28 / 0 / 0): Were the specific ISO 12207:2008 activities that are covered in the effort data (for example: requirements analysis, architectural design, detailed design, construction, integration, qualification testing, and support processes) clearly discernible?
– 1.1.6 (5 / 27 / 0 / 0): Is there an easily identifiable event associated with the submission (for example: Contract Award, Build 2 Release, Build 1 Complete, Contract Complete, etc.)?
– 1.6.1 (5 / 27 / 0 / 0): Was effort data reported for each CSCI or WBS?
– 1.2.14 (4 / 27 / 1 / 0): Is the software process maturity and quality reporting definition provided (for example: Capability Maturity Model (CMM), Capability Maturity Model Integration (CMMI), or other alternative rating)?
– 1.7.3 (3 / 27 / 2 / 0): Has schedule data been reported in number of months from contract start or as calendar dates?
– 1.2.13 (1 / 27 / 4 / 0): Are precedents reported and consistent from submission to submission?
– 1.3.3.4 (1 / 27 / 3 / 1): What precedents or prior builds are identified to give credibility to the upgrade designation?
– 1.3.3.2 (6 / 26 / 0 / 0): Has the contractor indicated whether the software is an upgrade or new development? If not, why not?
– 1.4.3 (6 / 26 / 0 / 0): Does the data dictionary define what the skill-level requirements are, and is the contractor adhering to that definition?
– 1.2.15 (3 / 26 / 3 / 0): Is the Process Maturity rating reported with an associated date, and has it changed from a prior submission?
– 1.2.10 (7 / 24 / 1 / 0): Has the contractor or submitting organization illustrated whether they were the primary or secondary developer?
– 1.6.6 (7 / 24 / 1 / 0): Were common WBS elements/labor categories such as Systems Engineering (SE), Program Management (PM), Configuration Management (CM), or Quality Management (QM) broken out separately?
– 1.7.5 (7 / 24 / 1 / 0): Does the report include unique schedule start and end date values? For example, do multiple records have the same schedule data (e.g., the same calendar dates for multiple WBS/CSCIs or builds)?
– 1.2.3 (5 / 24 / 3 / 0): Has the Prime Mission Product (PMP) name been clearly identified (for example: the most current official military designation)?
59. V&V Questions Most Frequently Answered with an "N/A" Response
SURF Process Summary & Initial Findings
All Reviews (Mar - Dec 2016); each question from the V&V guide template is shown with its response counts (No / Yes / N/A / No Response):
– 1.5.3.2 (0 / 0 / 32 / 0): If function points have been provided, has the submitting organization clearly illustrated the function point count type (for example: Enhancement Project, Application, or Development Project)?
– 1.5.3.3 (0 / 0 / 32 / 0): Has the submitting organization provided the number of Data Functions and Transactional Functions (for example: Internal Logic Files, External Interface Files, External Inquiries, External Inputs, and External Outputs)?
– 1.5.3.4 (0 / 0 / 32 / 0): Has the submitting organization included the Value Adjustment Factor?
– 1.5.3.5 (0 / 0 / 32 / 0): If the submitting organization has provided sizing metrics using the Reports, Interfaces, Conversions, Extensions, Forms, and Workflows (RICE-FW) convention, has the complexity of each RICE-FW category been provided?
– 1.8.1 (1 / 0 / 31 / 0): FACH: Has a description been provided that describes which ISO 12207:2008 elements have been included within the provided total?
– 1.8.2 (1 / 0 / 31 / 0): FACH: Do sub-element FAC values sum to the parent FAC total value?
– 1.8.3 (1 / 0 / 31 / 0): If the report is a final report, does the provided ATD total match the provided FAC total?
– 1.1.1 (1 / 1 / 30 / 0): Is the submission compliant with the CSDR Plan (i.e., a comparison of the submission to the plan requirement)?
– 1.5.3.1 (0 / 3 / 29 / 0): Were SLOC counts reported, or were other counting or sizing metrics used (e.g., function points, use cases, rung logic ladders, etc.)? If so, has the submitting organization obtained the appropriate authorization to report non-SLOC-based sizing within the corresponding CSDR plan?
– 1.6.17 (2 / 1 / 29 / 0): If subcontractor hours have not been provided, did the reporting organization provide subcontractor dollars?
– 1.5.2.17 (3 / 0 / 28 / 1): If COTS or GOTS integration or glue code has been included within the submission, does the total seem realistic when compared to the total SLOC included in the CSCI or WBS element (for example: COTS integration code equals 500 KSLOC while the total SLOC for the specific CSCI or WBS element equals 150 KSLOC)? Note: this scenario sometimes occurs when the submitting organization counts the total SLOC of the specified COTS or GOTS product rather than the integration or glue code required to integrate the product.
– 1.6.9 (5 / 3 / 24 / 0): Do the children or lower-level WBS/CSCI elements add up to the parent? If not, is there effort that is only captured at a higher-level WBS/CSCI level that should be allocated to the lower-level WBS/CSCI elements? (See the sketch after this list.)
– 1.2.18 (9 / 0 / 23 / 0): Has the total contract price been identified?
– 1.5.1.3 (3 / 4 / 23 / 2): Do the number of requirements trace from the parent to the children in the WBS? If not, this could imply that some portion of the software effort is only captured at higher-level WBS/CSCI elements and should be cross-checked.
– 1.5.2.9 (1 / 9 / 22 / 0): For a Final report, does the size look realistic? For example: is all of the code rounded to the nearest 1000 lines, or does the dictionary indicate that they had difficulty counting code that may have come from a subcontractor?
– 1.1.7 (2 / 8 / 21 / 1): If there are prior submissions, is this submission an update to a prior submission or a new event? If the submission is an update to an existing submission, does the latest submission clearly describe which prior report it is linked to?
– 1.2.11 (4 / 7 / 21 / 0): If effort was outsourced, has the outsourced organization been provided?
– 1.6.7 (7 / 4 / 21 / 0): Is there an explanation of missing activities included within the supporting SRDR data dictionary?
– 1.1.8 (5 / 6 / 20 / 1): If a prior submission exists, is the information that has changed readily identifiable, and is a reason for the change provided (either in the data dictionary or comments section)?
– 1.4.4 (0 / 12 / 20 / 0): Does the skill mix make sense relative to the complexity of the code (an unusual amount of very low or very high skill mix, for example)?
– 1.5.4.3 (10 / 2 / 20 / 0): If the report is an interim or final submission, has the number of Discovered, Removed, and Deferred defects changed from the previous submission? If significant changes have occurred, does the supporting comments section and/or data dictionary provide details regarding what drove the significant change in product quality metrics?
– 1.6.12 (6 / 6 / 20 / 0): Does the submission include unique values for each of the lower-level CSCI or WBS elements? For example, do multiple related records have the same effort data (i.e., activity effort is repeated or total effort is repeated)?
– 1.4.2 (8 / 4 / 19 / 1): If there was a prior submission, has the skill mix changed dramatically and, if so, is there an explanation why? Conversely, did it remain unchanged? If so, why?
– 1.5.2.14 (6 / 7 / 19 / 0): When subcontractor code is present, is it segregated from the prime contractor effort, and does it meet the same criteria for quality as the prime's code count?
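Several of the frequent "N/A" questions above (1.6.9, 1.5.1.3, 1.8.2) reduce to the same parent/child roll-up check. A minimal sketch with hypothetical numbers (none of these figures come from the deck):

def rollup_gap(parent_total: float, child_totals: list[float]) -> float:
    """Positive result = value captured only at the parent level, not allocated to children."""
    return parent_total - sum(child_totals)

# Example: a parent WBS element reporting 10,000 hours over children reporting
# 9,200 hours total leaves an 800-hour gap that may need to be allocated down.
print(rollup_gap(10_000, [4_000, 3_200, 2_000]))  # 800.0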
Editor's Notes
The reports are not management reports and are not intended for tracking progress.
Development (Format 1): A CSCI is the lowest level of software development at which configuration management is performed by the developer, usually indicated by a separate Software Development Folder (SDF), Software Requirements Specification (SRS), etc. The CSDR plan serves as the authoritative definition for reporting purposes. Introducing agile measures supports capturing the level of effort with this software development methodology.
Maintenance (Format 2): The Top Level Data includes all information applicable to the software maintenance release(s) for the reporting event, defines each of the data elements as required, and describes the methods and rules used to perform the data measurement or estimation. The Release Level Data is used to obtain the actual (as-built) characteristics of the maintenance product and its maintenance process at the release level.
ERP (Format 3): Part 1, System Technical Data, provides the report context, describes both the software product being developed and the development team, lists project requirements for modules, business processes, system functions and interfaces, and captures software product size and complexity at the Release Level. Part 2, Software Development Effort Data, captures project resource and schedule information at the Release Level. Format 3 uses the term "release" to refer to commonly used terms such as build, product build, and increment, and this document uses release as the primary term throughout.