CAST Confidential 1
CAST
The Leader in Software Analytics & Risk Prevention
The place of non-functional elements in measurement
Nesma Autumn Conference
November 9, 2017
Philippe-E. DOUZIECH
Principal Research Scientist
e: p.douziech@castsoftware.com
CAST Confidential 2
Agenda
Context presentation
•  Drivers for FSM-related measurement
•  Black box / white box measurement
•  Integrated measurement
Measurement solutions – OMG standards
•  Increase visibility on software size
•  Increase visibility on software-related activity
•  Increase visibility on software quality
•  List of available metrics
Effective sizing metrics
•  Objectives and definitions
•  Samples
Conclusions
CAST Confidential 3
Drivers for FSM-related measurement
§  Benchmark measure of selected normalized metrics across a group of applications
§  Trend measure of selected normalized metrics over time, aggregated per application
§  Trend outcome / effort of a specific team (application team, country, sourcing…)
§  Trend sizing, quality and complexity metrics in correlation with other metrics (cost, effort, time)
§  Scorecard with breakdown per line of business, application or custom grouping (in-house or outsourced)
§  Trend of the defined metrics
§  Transparency, quality and risk management, and SLAs
§  Baseline analysis and on-going comparisons of agility, responsiveness and productivity change
§  Sizing, complexity and quality information processed during a release, displayed per transaction
§  Standard deviation of effort (estimated vs. real vs. measured)
Driver categories: Normalization & Benchmarking; Productivity Measurement & Improvement; Measure Effectiveness of Transformation Initiative; ADM Supplier Measurement; Optimize ADM Estimation
CAST Confidential 4
Black box measurement
(Diagram: Cost and Workload data collected from the Client, with no view inside the code)
Client
•  Baseline and on-going effort data (e.g. cost, hours, headcount)
•  Baseline and on-going staffing per release
•  Incident / ticket reports
•  Closed pre-production defects within time commitment / Total closed pre-production defects
•  Release schedule
(Drivers panel repeated from slide 3.)
CAST Confidential 5
Black box / white box measurement
(Diagram: the application source code is measured for Size, Complexity and Quality; the Client provides Cost and Workload data)
Application source code
•  Measure applications within an IT portfolio to establish a baseline
•  Measure on-going activity and trends of application size, complexity and quality characteristics
•  Normalize size, complexity and quality metrics
•  Benchmark normalized metrics
•  Automated Function Points (AFP) and Automated Enhancement Points (AEP)
•  Effort, algorithmic, SQL and object complexity
•  Critical violations and violations with high weight
•  Health Factor trends
Client
(Client data bullets repeated from slide 4.)
(Drivers panel repeated from slide 3.)
CAST Confidential 6
Integrated measurement
(Diagram: the Practitioner combines the white-box view of the application source code, i.e. Size, Complexity and Quality, with the Client's Cost and Workload data)
Practitioner
§  Define a set of density ratios as a scorecard
§  Trend the set of density ratios
§  Benchmark normalized metrics
§  Establish a distribution channel for new metrics (CAST & internal)
§  Develop a rollout communication & training plan
(Application source code and Client measurement bullets repeated from slides 4 and 5.)
(Drivers panel repeated from slide 3.)
CAST Confidential 7
Agenda
Context presentation
•  Drivers for FSM-related measurement
•  Black box / white box measurement
•  Integrated measurement
Measurement solutions – OMG standards
•  Increase visibility on software size
•  Increase visibility on software-related activity
•  Increase visibility on software quality
•  List of available metrics
Effective sizing metrics
•  Objectives and definitions
•  Samples
Conclusions
CAST Confidential 8
What are the Available Standards?
AFP: Automated Function Point
AEP: Automated Enhancement Point
AEFP: Automated Enhancement Function Point
AETP: Automated Enhancement Technical Point
ASCRM: Automated Source Code Reliability Measure
ASCSM: Automated Source Code Security Measure
ASCPEM: Automated Source Code Performance Efficiency Measure
ASCMM: Automated Source Code Maintainability Measure
CAST Confidential 9
When to Use those Standards?
Evaluation before the enhancement: AFP (Automated Function Point), plus ASCRM / ASCSM / ASCPEM / ASCMM
Workload monitoring of the enhancement (code added / removed): AEP (Automated Enhancement Point), split into AEFP (Automated Enhancement Function Point) and AETP (Automated Enhancement Technical Point), plus ASCRM / ASCSM / ASCPEM / ASCMM
Evaluation after the enhancement: AFP (Automated Function Point), plus ASCRM / ASCSM / ASCPEM / ASCMM
CAST Confidential 10
Size – Automated Function Points (AFP)
AFP
Automated Function Point
1. Measure the number of transactions managed by the application in order to measure the amount of functionality.
2. Automated Function Points is a technology agnostic metric.
CAST Confidential 11
AFP Calculation
Process flow: the Client delivers the application source code → Packaging & Delivery → Analysis & Calibration → AFP Result
CAST Confidential 12
AFP Calculation – focus on a Transaction
(Abstract representation of software implementation)
CAST Confidential 13
Activity – Automated Enhancement Points (AEP)
AFP vs. AEFP on an example: Version A measures 1,915 AFP and Version B measures 1,940 AFP, a net difference of +25 AFP, even though the release went through several enhancement steps (1 to 5), each with its own enhancement size (X EFP, Y EFP, Z EFP, X' EFP):
1 to 2. Added new functionality: increased the function point count
2 to 3. Removed localization features: reduced the function point count
3 to 4. Modified existing functionality: no net change in function point count
4 to 5. Added new functionality: increased the function point count
AFP (Automated Function Point)
•  Measures the number of transactions managed by the application in order to measure the amount of functionality
•  Automated Function Points is a technology agnostic metric, independent of the complexity and quality of an application
•  Best used for the overall functional sizing of an application (used on Run the Business)
AEFP (Automated Enhancement Function Point)
•  Enhancement Function Points is a functional sizing unit that measures application enhancements and maintenance activities
•  Measures the functional size of modifications (added, updated, deleted) between two releases of an application
•  Best used to show the functional size of changes (Add/Delete/Update) in releases (used on Change the Business)
AEP (Automated Enhancement Point) = AEFP (Automated Enhancement Function Point) + AETP (Automated Enhancement Technical Point)
Functional enhancements count toward AEFP; framework development, optimization (cache mechanism), administration tasks and technical debt reduction count toward AETP.
Example: 1,212 AEFP + 144 AETP = 1,356 AEP
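In Python terms, the two examples above reduce to simple arithmetic (the figures below are only the ones printed on the slide; the per-step EFP values X, Y, Z, X' are left unspecified there):

  # Net functional size change between the two versions
  afp_version_a = 1915
  afp_version_b = 1940
  net_afp_delta = afp_version_b - afp_version_a    # +25 AFP, hiding the gross enhancement work

  # Separate example: enhancement activity is counted gross, functional plus technical
  aefp = 1212            # functional changes (Automated Enhancement Function Points)
  aetp = 144             # framework, optimization, administration, technical debt (AETP)
  aep  = aefp + aetp     # 1,356 Automated Enhancement Points

  print(net_afp_delta, aep)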
CAST Confidential 14
AEP Calculation
Process flow: the Client delivers the application source code and its new version → Packaging & Delivery → Analysis & Calibration → Result
CAST Confidential 15
AEP Calculation – “non-AFP” code
The application (100% of the code) splits into Functional Artifacts, which implement the transactions counted by AFP (Automated Function Points), and Technical Artifacts:
•  Every code element within the software boundaries
•  Not supporting the AFP implementation
•  But supporting software functioning
•  Maintained
•  Evolved
CAST Confidential 16
AEP Calculation – Implementation Points
Each artifact is assigned an Effort Complexity (EC), leading to Implementation Points (IP) when it is evolved.
Complexity measurements and their thresholds:
•  Algorithm complexity (simple / medium / complex / very complex): cyclomatic complexity (count of program and control decision statements)
•  SQL complexity (simple / medium / complex / very complex): raw SQL complexity (based on # of tables, # of subqueries, # of FROM clauses and GROUP BY per query)
•  Coupling, fan-in / fan-out (simple / medium / complex / very complex): number of links from or to the measured component
•  Ratio of documentation (simple / medium / lack of comments / not documented): (# of lines of comments - # of bad comments) / # of lines of code
•  Size of components (small / medium / large / very large): # of lines of code
Change detection between versions:
•  Checksum of the element, used to check whether the component has been modified
•  ‘Belongs to’ information, used to check which transactions are viewed as modified when a checksum changes (objects used by a transaction)
•  Effort Complexity variation: complexity processed = complexity in Vn - complexity in Vn-1
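A small Python sketch of the bucketing this slide describes; the cut-off values are invented for illustration, since the calibrated thresholds belong to the AEP specification and are not given here:

  LEVELS = ["simple", "medium", "complex", "very complex"]

  def bucket(value, cutoffs, levels=LEVELS):
      """Return the first level whose cut-off the value does not exceed."""
      for level, cutoff in zip(levels, cutoffs):
          if value <= cutoff:
              return level
      return levels[-1]

  def effort_complexity(cyclomatic, sql_raw, fan_in_out, comment_ratio, loc):
      """Bucket one artifact's measurements into the levels listed on the slide."""
      return {
          "algorithm complexity": bucket(cyclomatic, (10, 20, 50)),
          "SQL complexity": bucket(sql_raw, (2, 5, 10)),
          "coupling (fan-in/fan-out)": bucket(fan_in_out, (5, 15, 40)),
          "ratio of documentation": bucket(comment_ratio, (0.02, 0.05, 0.15),
                                           ["not documented", "lack of comments", "medium", "simple"]),
          "size of component": bucket(loc, (50, 200, 1000),
                                      ["small", "medium", "large", "very large"]),
      }

  print(effort_complexity(cyclomatic=25, sql_raw=1, fan_in_out=8,
                          comment_ratio=0.12, loc=340))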
CAST Confidential 17
AEP Calculation – using IP to compute AEFP
AEFP: Automated Enhancement Functional Point
The functional artifacts (91% of the example application) drive the AEFP count: the Implementation Points of the evolved artifacts and the functional complexity of the evolved transactions are combined through a Complexity Ratio* and a Reuse Ratio, giving 598 AEFP in this example.
* The Complexity Ratio includes complexity injected and complexity processed.
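The slide names the inputs (Implementation Points of evolved artifacts, functional complexity of evolved transactions, Complexity Ratio, Reuse Ratio) but not the exact aggregation rule, so the Python sketch below only illustrates the shape of the computation, with invented figures:

  # Invented inputs: (functional size of the evolved transaction in FP,
  #                   complexity ratio, reuse ratio)
  evolved_transactions = [
      (120, 1.3, 0.9),
      (300, 1.1, 0.7),
      (80,  1.6, 1.0),
  ]

  # Hypothetical aggregation: adjust each evolved transaction's functional size
  # by its complexity ratio (injected + processed) and reuse ratio, then sum.
  aefp = sum(fp * complexity_ratio * reuse_ratio
             for fp, complexity_ratio, reuse_ratio in evolved_transactions)
  print(round(aefp))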
CAST Confidential 18
AEP Calculation – using IP to compute AETP
AETP: Automated Enhancement Technical Point
An Equivalent Ratio (ER), derived from the AFP and the Implementation Points of the functional artifacts, estimates how many AFP could have been added with the same implementation effort (ER = 9% in this example). Applying the ER to the Implementation Points of the evolved technical artifacts yields the AETP (61 AETP in this example).
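A Python sketch of that ratio logic; the input numbers are invented (chosen to land near the slide's 9% and 61 AETP) and only the ratio structure comes from the slide:

  afp_total     = 1915      # functional size of the application, in AFP
  ip_functional = 21000     # Implementation Points of the functional artifacts

  equivalent_ratio = afp_total / ip_functional      # ~0.09 AFP per IP (the "9%")

  ip_evolved_technical = 670   # IP of the technical artifacts evolved in this release
  aetp = ip_evolved_technical * equivalent_ratio
  print(round(aetp))           # ~61 AETP: technical work expressed in AFP-equivalent points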
CAST Confidential 19
AEP Calculation – AEFP vs. AETP
Both measures start from the new code and apply a Complexity Factor and a Reuse Factor.
AEFP (Automated Enhancement Functional Point): enhancements on components that are part of a functional transaction, typical of functional releases.
AETP (Automated Enhancement Technical Point): enhancements on components that are not part of a functional transaction, typical of technical releases and migration releases.
CAST Confidential 20
Quality – Automated Source Code * Measures
ASCRM / ASCSM / ASCPEM / ASCMM
1. Measure the number of occurrences of severe quality issues.
2. Particular focus on system-level patterns.
“System-level coding violations lead to 90% of production outages.” (OVUM RESEARCH 2014)
“Tracking programming practices at the Unit Level alone may not translate into the anticipated business impact, […] most devastating defects can only be detected at the System Level.”
CAST Confidential
Structural & System Level Risks – Security
Compliance with a secured architecture
21
CAST Confidential
Structural & System Level Risks – Reliability
Compliance with a vetted architecture
22
CAST Confidential
Structural & System Level Risks – Security
User input validation against injection threats
23
CAST Confidential 24
Structural & System Level Risks – Efficiency
Very large SQL table access with no suitable index
CAST Confidential 25
Automated Function Points – available metrics
For the whole software
•  Total AFP
•  Transactional AFP
•  Data AFP
For each Data AFP
•  # of DET/RET, complexity level, EIF/ILF
For each Transactional AFP
•  # of DET/FTR, complexity level, EI/EO (EQ is considered as EO)
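A Python sketch of how such per-function details roll up into a total; the weight table is the standard IFPUG one that Automated Function Points are designed to mirror, and the function inventory is invented:

  # Weight table: standard IFPUG values (low / average / high complexity),
  # which the OMG Automated Function Points specification follows.
  WEIGHTS = {
      "ILF": {"low": 7, "average": 10, "high": 15},   # internal logical files
      "EIF": {"low": 5, "average": 7,  "high": 10},   # external interface files
      "EI":  {"low": 3, "average": 4,  "high": 6},    # external inputs
      "EO":  {"low": 4, "average": 5,  "high": 7},    # external outputs (EQs counted as EOs)
  }

  # Invented inventory: (function type, complexity level derived from DET/RET or DET/FTR counts)
  functions = [
      ("ILF", "low"), ("ILF", "average"), ("EIF", "low"),   # data functions
      ("EI", "average"), ("EO", "high"), ("EO", "low"),     # transactional functions
  ]

  total_afp = sum(WEIGHTS[kind][level] for kind, level in functions)
  print(total_afp)   # 7 + 10 + 5 + 4 + 7 + 4 = 37 AFP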
CAST Confidential 26
Automated Enhancement Points – available metrics
For the whole software
•  Total Automated Enhancement Points
•  Automated Enhancement Function Points (Added/Deleted/Updated)
•  Automated Enhancement Technical Points (Added/Deleted/Updated)
•  Implementation Points of AEFP (Added/Deleted/Updated)(Shared/Not shared)
•  Implementation Points of AETP (Added/Deleted/Updated)
•  Equivalent Ratio
For each Data AFP
•  Complexity Factor, DET (evolved)
For each Transactional AFP
•  Complexity Factor, Effort Complexity (variation/evolved/shared)
For each Artifact (code element)
•  Effort Complexity
CAST Confidential 27
ASC*M – available metrics
Number of occurrences of ASCRM-xxx-yyy reliability pattern
Number of occurrences of ASCSM-xxx-yyy security pattern
Number of occurrences of ASCPEM-xxx-yyy performance efficiency pattern
Number of occurrences of ASCMM-xxx-yyy maintainability pattern
CAST Confidential 28
Agenda
Context presentation
•  Drivers for FSM-related measurement
•  Black box / white box measurement
•  Integrated measurement
Measurement solutions – OMG standards
•  Increase visibility on software size
•  Increase visibility on software-related activity
•  Increase visibility on software quality
•  List of available metrics
Effective sizing metrics
•  Objectives and definitions
•  Samples
Conclusions
CAST Confidential 29
Characteristics of Effective Sizing Metrics
•  Meaningful to developer and user/customer
•  Defined (industry recognized)
•  Consistent (methodology)
•  Easy to learn and apply
•  Accurate, statistically based
•  Available when needed (early)
•  Addresses project-level information needs
Example metrics, combining sizing metrics, trending metrics and third-party metrics:
•  Density of critical violations, Added Critical Violation Density, Deleted critical violation density trending (from low density to high density)
•  Productivity (effort), Productivity (cost), Maintenance Cost per function point
•  Defect density (in testing or prod)
CAST Confidential 30
Release Assessment Overview
Normalized Sizing Information, Normalized Quality Information, Normalized Complexity Information
CAST Confidential 31
Model Performance
Level 1: Baselining – compare with a baseline (by reference, by technology)
Level 2: Internal Benchmarks – by dev methodology, by business unit, by vendor, by region
Level 3: Competitive Benchmarks – by industry
(Unit-price analogy: canned pears, 28 ounces at $1.35 vs. 16 ounces at $1.00 – compare on price per ounce, not per can)
•  Develop parametric models that utilize historical data to analyze the impact of selected process improvements
•  Provide a knowledge base for improved decision making
•  Identify areas of high impact (e.g., productivity and quality)
•  Create an atmosphere of measuring performance
•  Opportunity for comparison to industry best practices
CAST Confidential 32
Application Benchmark
Comparing one application against another drives actions such as:
•  Increase regression test activity
•  Plan a risk reduction program
•  Plan a training program
•  Ensure we have correct documentation
CAST Confidential 33
Define: Density Measure
Only AIP derived data
•  Critical Violation Density = critical violations / AFP. Monitor outcomes of work performed based on acceptable risk (benchmark and trending).
•  Added Critical Violation Density = added critical violations / AEP. Feedback to new teams to encourage learning and behavioral change, and to track subsequent progress (trending).
•  Deleted Critical Violation Density = deleted critical violations / AEP. Track reduction of technical debt, identify high-performing teams, input to trade-off decision making (trending).
External data combined with AIP derived data
•  Defect density (in testing or prod) = defect severity category 1 & 2 (version to version) / AEP. Monitor outcomes against expected risk levels; correlate with the Critical Violation Density metric.
AAD display: from low density of violations to high density of violations.
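A Python sketch of the four ratios above; the numerator/denominator pairing comes from the table, the input counts are invented:

  afp = 1940                        # functional size of the application
  aep = 1356                        # size of the latest enhancement

  critical_violations         = 412
  added_critical_violations   = 37
  deleted_critical_violations = 55
  severe_defects              = 9   # severity categories 1 & 2, version to version (external data)

  densities = {
      "critical violation density":         critical_violations / afp,
      "added critical violation density":   added_critical_violations / aep,
      "deleted critical violation density": deleted_critical_violations / aep,
      "defect density (testing or prod)":   severe_defects / aep,
  }
  for name, value in densities.items():
      print(f"{name}: {value:.3f}")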
CAST Confidential 34
Define: Unit Price or Unit Effort Measure
External data combined with AIP derived data
•  Productivity (effort) = AEP / (Dev + Unit Test effort between t1 and t2). What is the effort required to make changes? Compare with the metric for staffing purposes.
•  Productivity (cost) = AEP / (Dev cost + Unit Test cost, version to version). What is the cost required to make changes? Compare with the metric for budgeting purposes.
•  Maintenance Cost per function point = Maintenance cost (version to version) / AFP. What is the maintenance cost required to support changes? Target cost reduction.
Unit-price analogy: canned pears at 28 ounces for $1.35 (about $0.05 per ounce) vs. 16 ounces for $1.00 (about $0.06 per ounce); the 28-ounce can is the better buy because it costs less per ounce.
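The same normalization in a Python sketch, including the price-per-ounce analogy; all input values are invented:

  # Canned pears: normalize price by size before comparing
  print(1.35 / 28, 1.00 / 16)       # ~0.048 vs. 0.0625 dollars per ounce

  aep = 1356                        # enhancement size delivered between t1 and t2
  afp = 1940                        # functional size of the application

  effort_hours = 4200               # Dev + Unit Test effort between t1 and t2
  print(aep / effort_hours)         # Productivity (effort): AEP per hour

  cost = 380_000                    # Dev cost + Unit Test cost, version to version
  print(aep / cost * 1000)          # Productivity (cost): AEP per 1,000 currency units

  maintenance_cost = 150_000        # maintenance cost, version to version
  print(maintenance_cost / afp)     # Maintenance Cost per function point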
CAST Confidential 35
Choose the Right Sizing Unit to Calculate Density Information
Story Points reflect team expertise, team experience, complexity processed and complexity injected; Function Points reflect complexity injected only.
Estimation and internal team correlation (Story Points)
§  The sizing measure should be as close as possible to the activity of the development team, to represent their best guess on the complexity of a story.
§  The sizing measure should enable predictability of the development team.
§  The sizing measure should enable on-going velocity measurement.
External team correlation and benchmark (e.g. 300 AEP) – standard, scalable, technology and team agnostic
§  The sizing measure should be independent of team characteristics (expertise, experience).
§  The sizing measure should be independent of application characteristics (technology, complexity).
§  The sizing measure should enable benchmarking across teams, technologies and methodologies.
Defect Density for Change The Business (CTB) = number of defects introduced in the latest release / number of Automated Enhancement Points in the latest release
Defect Density for Run The Business (RTB) = total number of defects in the application (external data) / total number of Automated Function Points in the application
Other ratios over AFP: critical violations, maintenance effort, maintenance cost.
Maturity path: data collection → metric collection (metrics are volume-based) → correlation (metrics viewed in context of other business metrics) → metrics mapped to business outcomes (all metrics are mapped to business outcomes)
CAST Confidential
Client 1 – 300 applications
At a glance
309 applications
Minimum 1 scan per month for all 300 (Production)
80 applications scheduled to be scanned weekly
12 application “module” scans on demand (usually 2 a
week for high development)
Full Quality and FP configuration, early AEP stages
Average Scan time is 30-45 min
Mix of Technologies from mainframe (Cobol, RPG, PL1)
to 3G ( C, C++) and current suites (.net, JEE)
Fully Automated (Jenkins) moving to JIRA also
Total support staff for back-office (3)
FP calibration done on 10% of the apps per year
Fully Automated Scanning and Reporting (Jenkins)
Expanding the Data Reporting and Maturity
Measures in place (Source / Rec Level / Measure, with notes and external data required; each measure is flagged as a primary or secondary indicator against Cost to Operate, Business Delivery and Productivity):
•  CAST / Module / Tech Debt Density
•  CAST / Module / Dead Code
•  CAST / Product / Defect removal efficiency % – DRE = total defects remediated / total defects found before release
•  CAST / Module / In-process risk (new CV / EFP)
•  Client / Application / Average Ticket Fix Effort – time per ticket; external data: tickets, time
•  Client / Application / Ticket Volume – includes all customer complaints in the warranty period (typically 30 days from release); external data: tickets, time
•  Client / Product / Release Throughput – Agility – business functionality delivered by release; external data: specs/products vs. delivered (functional)
•  Mixed / Group / Effort per EFP (or AEFP) – calculated effort / AEFP or actual effort / AEFP; external data: time
•  Mixed / Group / EFP delivered per 100 worked hours; external data: time
•  Mixed / Product / Defects per 100 resource hours; external data: time
•  Mixed / Application / Development Impact – calculated effort / tracked total effort, may be adjusted for change in quality; typically function points estimated vs. function points produced; external data: FP estimated
•  Mixed / Program / CV per 100 resource hours; external data: time
•  Mixed / Application / FTE per 1k AFP maintained – FTE is a running average of team size; external data: hours
•  Mixed / Application / Cost per 1k AFP maintained; external data: cost
Future Plans
•  Q1-2018
•  Integration of model-based estimates (statistical models of like projects)
•  Productivity, Cost and Velocity
at the team level
•  Q2 – 2018
•  Expansion by 100
applications as M&A
completes
•  Scan for M&A targets to
estimate workload
•  Full DevOps integration with
code drops weekly
“CAST enables speed. As the portfolio changes, the
information about that portfolio becomes the
essential decision tool”
Client Exec
36
CAST Confidential
Client 2 – 200 Vendor Managed Applications
At a glance
200 applications
Minimum 1 scan every 4 months (Production)
100 vendor applications scheduled to be scanned
monthly with SLA penalties and Incentives
2 application “module” scans on demand (usually 2 a
week for high development)
Full Quality and FP configuration, early AEP stages
Average Scan time is 1-1.3 hrs
Mix of Technologies from mainframe (Cobol, PL1) to
3G ( C, C++) and current suites (.net, JEE), early
Python Adopter
Fully Automated (Jenkins) moving to JIRA also
Total support staff for back-office (2 FTE) – 10 staff
that handle results consulting as well.
FP calibration done on 10% of the apps per year
Manual FP counts on 40 applications spending 400
effort hours (***) per application
Future Plans
•  Q4-2017
•  Unit Cost Measure rollout
(Productivity) and model based
estimation
•  AEP pilot in progress
•  Expansion of Vendor Involvement
•  Q1 – 2018
•  Module Scan process for faster return
•  JIRA flow to deliver results to
Developers faster
•  Full DevOps integration with code
drops weekly
•  Vendors are on the hook to produce increasing FP/HR while not compromising quality
•  Results reviewed monthly with true-ups
•  All vendors are on the automated process (Standard of Legitimacy rule)
37
CAST Confidential
Client 3 – 10 Applications, target 100
At a glance
10 current applications
Minimum 1 scan every Month (Production)
Rapid response scans as needed based on
production issues.
Full Quality and FP configuration, fully
automated, calibration toolkit
Average Scan time is 30 min
Mix of Technologies from mainframe (Cobol,
PL1) to 3G ( C, C++) but mostly focused on
Cobol
Fully Automated (Jenkins) moving to JIRA also
with a pull from source code control
Total support staff for back-office (2 FTE) .
FP calibration done by toolkit
Future Plans
Q4-2017
•  More applications, FASTER
•  Improve Predictable scan times
•  Tighten the integration to Eclipse
Q1 – 2018
•  Push more information to
executive dashboard including
FP counts
•  Investigate productivity using FP
or AEP depending on study
results
•  Need to push velocity without compromising customer experience or security
•  Increase FP/HR while not compromising quality
38
CAST Confidential 39
Agenda
Context presentation
•  Drivers for FSM-related measurement
•  Black box / white box measurement
•  Integrated measurement
Measurement solutions – OMG standards
•  Increase visibility on software size
•  Increase visibility on software-related activity
•  Increase visibility on software quality
•  List of available metrics
Effective sizing metrics
•  Objectives and definitions
•  Samples
Conclusions
CAST Confidential 40
Conclusions
CAST delivers analytics
•  Based on OMG standards (AFP/AEP/ASCRM/ASCSM/ASCPEM/ASCMM)
•  Assessing various aspects of software (size/activity/quality)
•  Encompassing both functional and non-functional requirements
To increase visibility
To support effective indicators
CAST Confidential
Thank you for attending
APPLICATION INTELLIGENCE PLATFORM
•  Approx. 2,300 apps and 3 billion LoC
•  Query by industry, technology & geo
•  CRASH Annual Report
•  CAST Research Labs
•  Custom benchmarks
•  SaaS, Cloud based
•  Source code analyzed where it resides
•  Rapid portfolio analysis
•  Portfolio continuous monitoring
•  Software flaw detection
•  Architectural analysis and blueprinting
•  Critical violation drill down
•  Propagation risk
•  Standards-based software metrics
•  Automated function points
•  Trend analysis
•  Transaction risk
41

 
Introduction to Decentralized Applications (dApps)
Introduction to Decentralized Applications (dApps)Introduction to Decentralized Applications (dApps)
Introduction to Decentralized Applications (dApps)
 
Exploring iOS App Development: Simplifying the Process
Exploring iOS App Development: Simplifying the ProcessExploring iOS App Development: Simplifying the Process
Exploring iOS App Development: Simplifying the Process
 
Building Real-Time Data Pipelines: Stream & Batch Processing workshop Slide
Building Real-Time Data Pipelines: Stream & Batch Processing workshop SlideBuilding Real-Time Data Pipelines: Stream & Batch Processing workshop Slide
Building Real-Time Data Pipelines: Stream & Batch Processing workshop Slide
 
Hand gesture recognition PROJECT PPT.pptx
Hand gesture recognition PROJECT PPT.pptxHand gesture recognition PROJECT PPT.pptx
Hand gesture recognition PROJECT PPT.pptx
 
Try MyIntelliAccount Cloud Accounting Software As A Service Solution Risk Fre...
Try MyIntelliAccount Cloud Accounting Software As A Service Solution Risk Fre...Try MyIntelliAccount Cloud Accounting Software As A Service Solution Risk Fre...
Try MyIntelliAccount Cloud Accounting Software As A Service Solution Risk Fre...
 
5 Signs You Need a Fashion PLM Software.pdf
5 Signs You Need a Fashion PLM Software.pdf5 Signs You Need a Fashion PLM Software.pdf
5 Signs You Need a Fashion PLM Software.pdf
 
Call Girls In Mukherjee Nagar 📱 9999965857 🤩 Delhi 🫦 HOT AND SEXY VVIP 🍎 SE...
Call Girls In Mukherjee Nagar 📱  9999965857  🤩 Delhi 🫦 HOT AND SEXY VVIP 🍎 SE...Call Girls In Mukherjee Nagar 📱  9999965857  🤩 Delhi 🫦 HOT AND SEXY VVIP 🍎 SE...
Call Girls In Mukherjee Nagar 📱 9999965857 🤩 Delhi 🫦 HOT AND SEXY VVIP 🍎 SE...
 
Der Spagat zwischen BIAS und FAIRNESS (2024)
Der Spagat zwischen BIAS und FAIRNESS (2024)Der Spagat zwischen BIAS und FAIRNESS (2024)
Der Spagat zwischen BIAS und FAIRNESS (2024)
 
EY_Graph Database Powered Sustainability
EY_Graph Database Powered SustainabilityEY_Graph Database Powered Sustainability
EY_Graph Database Powered Sustainability
 

Ac2017 3. cast software-metricsincontracts

  • 5. CAST Confidential Quality Complexity Size Cost Workload Black box / white box measurement Application source code Client • Measure applications within an IT portfolio to establish a baseline • Measure on-going activity, trends of application size, complexity and quality characteristics. • Normalize size, complexity and quality metrics • Benchmark of normalized metrics • Automated Function Points (AFP) and Automated Enhancement Points (AEP) • Effort, algorithmic, SQL, object complexity • Critical violations and violations with high weight • Health Factor trends • Baseline and ongoing effort data (e.g. cost, hours, headcount) • Baseline and on-going staffing per release • Incident / ticket reports • Closed pre-production defects within time commitment / Total closed pre-production defects • Release schedule § Benchmark measure of selected normalized metrics across a group of applications. § Trend measure of selected normalized metrics over time and aggregated per application. § Trend outcome / effort of a specific team (application team, country, sourcing…) § Trend sizing, quality, complexity metrics in correlation to other metrics (cost, effort, time) § Scorecard with breakdown per line of business, application or custom grouping (in-house or outsourced) § Trend of the defined metrics § Transparency, quality and risk management, and SLAs § Baseline analysis and on-going comparisons on Agility, responsiveness and Productivity change § Sizing, complexity and quality information processed during a release. Information display per transaction. § Standard deviation of effort (estimation vs real vs measured) Normalization & Benchmarking Productivity Measurement & Improvement Measure Effectiveness of Transformation Initiative ADM Supplier Measurement Optimize ADM Estimation 5
  • 6. CAST Confidential § Define set of density ratios as a scorecard § Trend set of density ratios § Benchmark normalized metrics § Establish distribution channel for new Metrics (CAST & internal) § Develop rollout communication & training plan Quality Complexity Size Cost Workload Integrated measurement Application source code Client Practitioner • Measure applications within an IT portfolio to establish a baseline • Measure on-going activity, trends of application size, complexity and quality characteristics. • Normalize size, complexity and quality metrics • Benchmark of normalized metrics • Automated Function Points (AFP) and Automated Enhancement Points (AEP) • Effort, algorithmic, SQL, object complexity • Critical violations and violations with high weight • Health Factor trends • Baseline and ongoing effort data (e.g. cost, hours, headcount) • Baseline and on-going staffing per release • Incident / ticket reports • Closed pre-production defects within time commitment / Total closed pre-production defects • Release schedule § Benchmark measure of selected normalized metrics across a group of applications. § Trend measure of selected normalized metrics over time and aggregated per application. § Trend outcome / effort of a specific team (application team, country, sourcing…) § Trend sizing, quality, complexity metrics in correlation to other metrics (cost, effort, time) § Scorecard with breakdown per line of business, application or custom grouping (in-house or outsourced) § Trend of the defined metrics § Transparency, quality and risk management, and SLAs § Baseline analysis and on-going comparisons on Agility, responsiveness and Productivity change § Sizing, complexity and quality information processed during a release. Information display per transaction. § Standard deviation of effort (estimation vs real vs measured) Normalization & Benchmarking Productivity Measurement & Improvement Measure Effectiveness of Transformation Initiative ADM Supplier Measurement Optimize ADM Estimation 6
  • 7. CAST Confidential 7 Agenda Context presentation •  Drivers for FSM-related measurement •  Black box / white box measurement •  Integrated measurement Measurement solutions – OMG standards •  Increase visibility on software size •  Increase visibility on software-related activity •  Increase visibility on software quality •  List of available metrics Effective sizing metrics •  Objectives and definitions •  Samples Conclusions
  • 8. CAST Confidential 8 What are the Available Standards? AFP Automated Function Point AEP AEFP Automated Enhancement Function Point AETP Automated Enhancement Technical Point Automated Enhancement Point ASCRM ASCSM ASCPEM ASCMM Automated Source Code Reliability Measure Automated Source Code Security Measure Automated Source Code Performance Efficiency Measure Automated Source Code Maintainability Measure
  • 9. CAST Confidential 9 When to Use those Standards? AEP AEFP Automated Enhancement Function Point AETP Automated Enhancement Technical Point Automated Enhancement Point AFP Automated Function Point AFP Automated Function Point Workload Monitoring Evaluation Before Evaluation After ASCRM ASCSM ASCPEM ASCMM ASCRM ASCSM ASCPEM ASCMM ASCRM ASCSM ASCPEM ASCMM Added Removed
  • 10. CAST Confidential 10 Size – Automated Function Points (AFP) AFP Automated Function Point 1 2 Measure the number of transactions managed by the application in order to measure the amount of functionality Automated Function Points is a technology agnostic metric.
  • 11. CAST Confidential Packaging & Delivery Analysis & Calibration AFP Result AFP Calculation Application source code Client 11
  • 12. CAST Confidential 12 AFP Calculation – focus on a Transaction (Abstract representation of software implementation)
  • 13. CAST Confidential Automated Function Points Automated Enhancement Function Points 13 Activity – Automated Enhancement Points (AEP) + 25 AFP 1 to 2. Added new functionality: increased the function point count 2 to 3. Removed localization features: reduced the function point count 3 to 4. Modified existing functionality: no net change in function point count 4 to 5. Added new functionality: increased the function point count X EFP Y EFP Z EFP X’ EFP 1 2 3 4 5 AFP Automated Function Point Version A 1,915 AFP Version B 1,940 AFP AFP Automated Function Point AFP AEFP • Measure the number of transactions managed by the application in order to measure the amount of functionality • Automated Function Points is a technology agnostic metric, and independent of the complexity and quality of an application. • Best used for overall functional sizing of an application (Used on Run the Business) • Enhanced Function Points is a functional sizing unit that measures application enhancements and maintenance activities • Measures the functional sizes of modifications (added, updated, deleted) between two releases of an application • Best used to show functional size of changes (Add/Delete/Update) in releases (Used on Change the Business) Functional Framework development Optimization (cache mechanism) Administration tasks Technical Debt Reduction 1212 AEFP + 144 AETP = 1356 AEP AEP AEFP Automated Enhancement Functional Point AETP Automated Enhancement Technical Point Automated Enhancement Point Automated Enhancement Point
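A minimal sketch of the idea in the slide above, under its own assumptions: the net AFP delta between two versions (1,940 − 1,915 = +25 AFP) hides work, because deleted and modified functionality also has to be built and tested, which is what the enhancement measure is meant to capture. The transaction names and point values below are hypothetical, and the counting rule is a toy illustration, not the OMG AEP algorithm.

```python
# Toy comparison of net functional size change vs. enhancement work between two releases.
# All transaction names and FP values are hypothetical.

def net_afp_delta(before: dict, after: dict) -> int:
    """Net functional size change between two releases."""
    return sum(after.values()) - sum(before.values())

def enhancement_size(before: dict, after: dict) -> int:
    """Counts added + deleted + modified functionality as enhancement work."""
    added    = sum(fp for name, fp in after.items() if name not in before)
    deleted  = sum(fp for name, fp in before.items() if name not in after)
    modified = sum(fp for name, fp in after.items()
                   if name in before and before[name] != fp)
    return added + deleted + modified

version_a = {"create_order": 6, "localize_ui": 4, "search_catalog": 5}
version_b = {"create_order": 7, "search_catalog": 5, "track_shipment": 6}  # localization removed

print(net_afp_delta(version_a, version_b))     # small net change in AFP...
print(enhancement_size(version_a, version_b))  # ...but noticeably more enhancement work
```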
  • 14. CAST Confidential Packaging & Delivery Analysis & Calibration AFP Result 14 AEP Calculation Application source code New version Client
  • 15. CAST Confidential 15 AEP Calculation – “non-AFP” code Application Functional Artifacts AFP Automated Function Point Technical Artifacts • Every code element within the software boundaries • Not supporting AFP implementation • But supporting software functioning • Maintained • Evolved 100%
  • 16. CAST Confidential 16 AEP Calculation – Implementation Points Each Artifact is assigned an Effort Complexity (EC), leading to Implementation Points (IP) when evolved Algorithm complexity Thresholds • simple • medium • complex • very complex Cyclomatic complexity (count of program and control decision statements) SQL complexity Thresholds • simple • medium • complex • very complex Raw SQL Complexity (based on # of tables, # of subqueries, # of FROM clauses and GROUP BY per query) Coupling (Fan in, Fan out) Thresholds • simple • medium • complex • very complex Number of links per component from or to the component measured Ratio of documentation Thresholds • simple • medium • lack of comments • not documented (# of lines of comments - # of bad comments) / # of lines of code Size of components Thresholds • small • medium • large • very large # of lines of code Complexity measurement Checksum Checksum of the element, used to check if the component has been modified Object used by a transaction Complexity measurement in Vn-1 Complexity processed Complexity in Vn – Complexity in Vn-1 Effort Complexity variation ‘Belong to’ information, used to check which transaction will be viewed as modified if the checksum changed
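The slide above lists the inputs to Effort Complexity: cyclomatic complexity, raw SQL complexity, coupling, documentation ratio, component size, plus a checksum used to detect that an artifact evolved between versions. The sketch below shows one way those inputs could be banded by thresholds and combined; the threshold values, the aggregation rule, and the checksum choice are assumptions for illustration, not CAST's actual computation.

```python
# Illustrative Effort Complexity sketch; thresholds, weights and aggregation are assumed.
import hashlib
from dataclasses import dataclass

def band(value: float, thresholds: tuple) -> int:
    """Map a raw measure to a 0..3 band (simple / medium / complex / very complex)."""
    low, mid, high = thresholds
    if value <= low:  return 0
    if value <= mid:  return 1
    if value <= high: return 2
    return 3

@dataclass
class Artifact:
    source: str
    cyclomatic: int        # decision statements counted in the code
    sql_complexity: int    # tables, subqueries, FROM clauses, GROUP BY per query
    fan_in_out: int        # links from/to other components
    comment_ratio: float   # (comment lines - bad comments) / lines of code
    loc: int               # lines of code

    def checksum(self) -> str:
        # Compared release to release to decide whether the artifact evolved at all.
        return hashlib.sha256(self.source.encode()).hexdigest()

    def effort_complexity(self) -> int:
        # Hypothetical aggregation: sum of per-dimension bands; poor documentation adds effort.
        return (band(self.cyclomatic, (10, 20, 40))
                + band(self.sql_complexity, (2, 5, 10))
                + band(self.fan_in_out, (5, 15, 30))
                + band(self.loc, (50, 200, 1000))
                + (3 - band(self.comment_ratio * 100, (5, 15, 30))))

old = Artifact("SELECT * FROM orders", 12, 3, 8, 0.10, 180)
new = Artifact("SELECT * FROM orders WHERE status = 'OPEN'", 14, 4, 8, 0.10, 190)
if old.checksum() != new.checksum():  # the artifact evolved in this release
    print("evolved, EC:", new.effort_complexity())
```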
  • 17. CAST Confidential 17 AEP Calculation – using IP to compute AEFP AEFP Automated Enhancement Functional Point Application Functional Artifacts 100% 91% 598 AEFP • Complexity Ratio* • Reuse Ratio Evolved Artifacts IP Evolved Transactions • Functional Complexity Technical Artifacts Functional transactions * Complexity Ratio includes complexity injected and processed.
  • 18. CAST Confidential 18 AEP Calculation – using IP to compute AETP Artifacts IP AFP Equivalent Ratio Application Functional Artifacts 100% Equivalent Ratio (ER) 9% 61 AETP AETP Automated Enhancement Technical Point Technical Artifacts Evolved technical Artifacts IP • Estimate how many AFP could have been added with the same implementation effort
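The two slides above split enhancement size into a functional part (Implementation Points on artifacts belonging to functional transactions, adjusted by complexity and reuse ratios) and a technical part (Implementation Points on non-AFP artifacts, converted through an equivalence ratio). The sketch below is a simplified reading of that split, not the OMG specification; the IP figures and ratio values are hypothetical.

```python
# Simplified AEFP / AETP / AEP combination; all inputs below are assumptions.

def aefp(evolved_functional_ip: float, complexity_ratio: float, reuse_ratio: float) -> float:
    """Functional enhancement size from IP on artifacts that belong to functional transactions."""
    return evolved_functional_ip * complexity_ratio * reuse_ratio

def aetp(evolved_technical_ip: float, equivalence_ratio: float) -> float:
    """Technical enhancement size: how many AFP could have been added with the
    same implementation effort spent on non-AFP (technical) code."""
    return evolved_technical_ip * equivalence_ratio

# Hypothetical release figures.
functional_part = aefp(evolved_functional_ip=1430.0, complexity_ratio=0.94, reuse_ratio=0.90)
technical_part  = aetp(evolved_technical_ip=1600.0, equivalence_ratio=0.09)

# Total enhancement size for the release, in the spirit of "AEFP + AETP = AEP".
print(round(functional_part), round(technical_part), round(functional_part + technical_part))
```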
  • 19. CAST Confidential 19 AEP Calculation – AEFP vs. AETP New Code Complexity Factor Reuse Factor AEFP Automated Enhancement Functional Point AETP Automated Enhancement Technical Point Enhancement on components part of Functional Transaction Enhancement on components not part of Functional Transaction Functional Releases Technical Release Migration Release
  • 20. CAST Confidential ASCRM ASCSM ASCPEM ASCMM 1 Measure the number of occurrences of severe quality issues 2 Particular focus on system-level patterns. Quality – Automated Source Code Measures “System-level coding violations lead to 90% of production outages.” OVUM RESEARCH 2014 “Tracking programming practices at the Unit Level alone may not translate into the anticipated business impact, […] most devastating defects can only be detected at the System Level.” 20
  • 21. CAST Confidential Structural & System Level Risks – Security Compliance to secured architecture 21
  • 22. CAST Confidential Structural & System Level Risks – Reliability Compliance to vetted architecture 22
  • 23. CAST Confidential Structural & System Level Risks – Security User input validation against injection threats 23
  • 24. CAST Confidential 24 Structural & System Level Risks – Efficiency Very large SQL table access with no suitable index
  • 25. CAST Confidential 25 Automated Function Points – available metrics For the whole software •  Total AFP •  Transactional AFP •  Data AFP For each Data AFP •  # of DET/RET, complexity level, EIF/ILF For each Transactional AFP •  # of DET/FTR, complexity level, EI/EO (EQ is considered as EO)
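For the per-function details listed above, a data function's complexity level is driven by its DET/RET counts. The sketch below uses a classic IFPUG-style matrix and weights as an assumption for illustration; it is not a quotation of the OMG AFP specification, and the thresholds shown should be checked against the standard before being relied on.

```python
# Classify a data function's complexity from DET/RET counts and weight it.
# Thresholds and weights follow the classic IFPUG-style matrix (assumed here).

def data_function_complexity(det: int, ret: int) -> str:
    if ret <= 1:
        return "Low" if det <= 50 else "Average"
    if ret <= 5:
        return ("Low", "Average", "High")[0 if det <= 19 else 1 if det <= 50 else 2]
    return ("Average", "High", "High")[0 if det <= 19 else 1 if det <= 50 else 2]

WEIGHTS = {  # function points per data function, by type and complexity level
    "ILF": {"Low": 7, "Average": 10, "High": 15},
    "EIF": {"Low": 5, "Average": 7, "High": 10},
}

def data_function_afp(kind: str, det: int, ret: int) -> int:
    return WEIGHTS[kind][data_function_complexity(det, ret)]

# Hypothetical internal logical file with 23 DETs and 3 RETs.
print(data_function_complexity(23, 3), data_function_afp("ILF", 23, 3))  # Average, 10
```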
  • 26. CAST Confidential 26 Automated Enhancement Points – available metrics For the whole software •  Total Automated Enhancement Points •  Automated Enhancement Function Points (Added/Deleted/Updated) •  Automated Enhancement Technical Points (Added/Deleted/Updated) •  Implementation Points of AEFP (Added/Deleted/Updated)(Shared/Not shared) •  Implementation Points of AETP (Added/Deleted/Updated) •  Equivalence Ratio For each Data AFP •  Complexity Factor, DET (evolved) For each Transactional AFP •  Complexity Factor, Effort Complexity (variation/evolved/shared) For each Artifact code elements •  Effort Complexity
  • 27. CAST Confidential 27 ASC*M – available metrics Number of occurrences of ASCRM-xxx-yyy reliability pattern Number of occurrences of ASCSM-xxx-yyy security pattern Number of occurrences of ASCPEM-xxx-yyy performance efficiency pattern Number of occurrences of ASCMM-xxx-yyy maintainability pattern
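These metrics are plain occurrence counts per detected weakness pattern. A minimal roll-up sketch follows; the pattern identifiers are placeholders, not real rule identifiers from the standards.

```python
# Count pattern occurrences per rule and per standard; identifiers are placeholders.
from collections import Counter

detected = [  # (standard, pattern id) pairs as an analyzer might report them
    ("ASCSM", "ASCSM-xxx-001"), ("ASCSM", "ASCSM-xxx-001"),
    ("ASCRM", "ASCRM-xxx-007"), ("ASCPEM", "ASCPEM-xxx-002"),
]

per_pattern  = Counter(pattern for _, pattern in detected)
per_standard = Counter(std for std, _ in detected)
print(dict(per_pattern))
print(dict(per_standard))
```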
  • 28. CAST Confidential 28 Agenda Context presentation •  Drivers for FSM-related measurement •  Black box / white box measurement •  Integrated measurement Measurement solutions – OMG standards •  Increase visibility on software size •  Increase visibility on software-related activity •  Increase visibility on software quality •  List of available metrics Effective sizing metrics •  Objectives and definitions •  Samples Conclusions
  • 29. CAST Confidential 29 Characteristics of Effective Sizing Metrics Third party Metrics Sizing Metrics Trending Metrics Added Critical Violation Density Density of critical violation Deleted critical violation density trending Productivity (effort) Defect density (in testing or prod) 1 2 3 Productivity (cost) Maintenance Cost per function point Low density High density • Meaningful to developer and user/customer • Defined (industry recognized) • Consistent (methodology) • Easy to learn and apply • Accurate, statistically based • Available when needed (early) • Addresses project level information needs
  • 30. CAST Confidential 30 Release Assessment Overview Normalized Quality Information Normalized Sizing Information Normalized Complexity Information
  • 31. CAST Confidential 31 Model Performance Level 3: Competitive Benchmarks Level 1: Baselining Level 2: Internal Benchmarks Compare with a baseline By Reference By Technology 1 2 3 By Industry Canned pears, 28 ounces, costs $1.35 Canned pears, 16 ounces, costs $1.00 By Dev Methodology By Business Unit By Vendor By Region • Develop parametric models that utilize historical data to analyze the impact of selected process improvements • Provide a knowledge base for improved decision making • Identify areas of high impact (e.g., productivity and quality) • Create an atmosphere of measuring performance • Opportunity for comparison to industry best practices
  • 32. CAST Confidential VS 32 Application Benchmark Increase regression test activity Plan Risk reduction program Plan a training program Ensure we have correct documentation
  • 33. CAST Confidential 33 Define: Density Measure Only AIP derived data   Correlation   Numerator   Denominator   AAD Display   Detail   Critical Violation Density Critical violation AFP Monitor outcomes of work performed based on acceptable risk (Benchmark and trending) Added Critical Violation Density Added critical violation AEP Feedback to new teams to encourage learning and behavioral change; and to track subsequent progress (trending) Deleted Critical Violation Density Deleted critical violation AEP Track reduction of technical debt. Identify high performing teams. Input to tradeoff decision-making. (trending) External Data combined with AIP derived data   Defect density (in testing or prod) Defect Severity Cat 1&2 (Version to version ) AEP Monitor outcomes with expected risk levels. Correlate with Critical Violation Density metric. Low density of violations High density of violations
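Reading the table above as simple Numerator / Denominator quotients, the density measures reduce to a few divisions. The sketch below is a hypothetical illustration; the per-1,000 normalisation is an assumption added purely for readability and is not part of the slide's definitions.

```python
# Density ratios as count-over-size quotients; all figures are hypothetical.

def density(count: int, size: float, per: int = 1000) -> float:
    """Occurrences per `per` size units (e.g. per 1,000 AFP or AEP)."""
    return per * count / size

afp_total   = 1940   # functional size of the application
aep_release = 1356   # enhancement size of the latest release

print(density(97, afp_total))     # Critical Violation Density, per 1,000 AFP
print(density(12, aep_release))   # Added Critical Violation Density, per 1,000 AEP
print(density(30, aep_release))   # Deleted Critical Violation Density, per 1,000 AEP
print(density(5, aep_release))    # Severity 1 & 2 defect density, per 1,000 AEP
```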
  • 34. CAST Confidential 34 Define: Unit Price or Unit Effort Measure External Data combined with AIP derived data Correlation Numerator Denominator AAD Display Detail Productivity (effort) AEP Dev + Unit Test Effort (t1,t2) What is the effort required to make changes? Compare with metric for staffing purposes. Productivity (cost) AEP Dev Cost + Unit Test Cost (Version to Version) What is the cost required to make changes? Compare with metric for budgeting purposes. Maintenance Cost per function point Maintenance Cost (Version to Version) AFP What is the maintenance cost required to support changes? Target cost reduction. At about $0.05 an ounce, the 28-ounce can is the better buy; it costs less per ounce than the 16-ounce can at roughly $0.06 an ounce. Canned pears, 28 ounces, costs $1.35 Canned pears, 16 ounces, costs $1.00
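The unit-price logic in the table is the same as the canned-pears comparison: divide what was obtained by what it cost. A small sketch follows; the release effort, cost, and size figures are hypothetical.

```python
# Unit price / unit effort as simple quotients; all software figures are hypothetical.

def unit_price(cost: float, quantity: float) -> float:
    return cost / quantity

# The pears: the 28-ounce can wins at roughly $0.05/oz vs $0.06/oz.
print(round(unit_price(1.35, 28), 3), round(unit_price(1.00, 16), 3))

# The same arithmetic applied to a software release.
aep_delivered    = 1356     # enhancement size of the release
dev_effort_hours = 2100     # development + unit test effort
release_cost     = 190_000  # development + unit test cost
maintenance_cost = 260_000  # annual maintenance spend
afp_total        = 1940     # functional size of the application

print(aep_delivered / dev_effort_hours)  # productivity: AEP delivered per effort hour
print(release_cost / aep_delivered)      # unit cost per AEP delivered
print(maintenance_cost / afp_total)      # maintenance cost per function point
```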
  • 35. CAST Confidential Story Points Team Expertise Team Experience Complexity Process Complexity injected Function Points Complexity injected 35 Choose the Right Sizing Unit to Calculate Density Information Estimation and internal team correlation External team correlation and benchmark § Sizing measure should be as close as possible to the activity of the development team to represent their best guess on the complexity of a story. § Sizing measure should enable predictability of the development team. § Sizing measure should enable on-going velocity measurement. § Sizing measure should be independent of team characteristics (expertise, experience) § Sizing measure should be independent of application characteristics (technology, complexity). § Sizing measure should enable benchmarking across team, technology, methodology… Standard Scalable Technology and team agnostic Story Points 300 AEP Defect Density for Run The Business (RTB) Defect Density for Change The Business (CTB) AFP Critical Violation Maintenance Effort Maintenance Cost Number of defects introduced in the Latest Release Number of Automated Enhancement Points in the Latest Release Total Number of defects in the Application (External data) Total Number of Automated Function Points in the Application Data Collection Metric Collection Correlation Correlation Metrics mapped to business outcomes Metrics are volume-based Metrics viewed in context of other business metrics All metrics are mapped to business outcomes
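The two defect-density definitions above normalise defects by different sizes: the latest release's AEP for change-the-business work, and the application's total AFP for run-the-business work. A minimal sketch follows; the 300 AEP release size echoes the slide, while the defect counts and the AFP total are hypothetical.

```python
# CTB vs RTB defect density, per the definitions on the slide; counts are hypothetical.

def ctb_defect_density(defects_in_release: int, aep_in_release: float) -> float:
    """Defects introduced in the latest release / AEP of that release."""
    return defects_in_release / aep_in_release

def rtb_defect_density(total_defects: int, total_afp: float) -> float:
    """Total defects in the application / total AFP of the application."""
    return total_defects / total_afp

print(ctb_defect_density(defects_in_release=9, aep_in_release=300))
print(rtb_defect_density(total_defects=120, total_afp=1940))
```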
  • 36. CAST Confidential Client 1 – 300 applications At a glance 309 applications Minimum 1 scan per month for all 300 (Production) 80 applications scheduled to be scanned weekly 12 application “module” scans on demand (usually 2 a week for high development) Full Quality and FP configuration, early AEP stages Average Scan time is 30-45 min Mix of Technologies from mainframe (Cobol, RPG, PL1) to 3G (C, C++) and current suites (.net, JEE) Fully Automated (Jenkins) moving to JIRA also Total support staff for back-office (3) FP calibration done on 10% of the apps per year Fully Automated Scanning and Reporting (Jenkins) Expanding the Data Reporting and Maturity Source Rec Level Measure Cost to Operate Business Delivery Productivity Note/Remarks External Data Required CAST Module Tech Debt Density CAST Module Dead Code CAST Product Defect removal efficiency % DRE = total defects remediated / total defects found before release CAST Module In process risk (New CV/EFP) Client Application Average Ticket Fix Effort Time / Ticket Tickets, Time Client Application Ticket Volume Include all customer complaints in warranty period (typically 30 days from release) Tickets, Time Client Product Release Throughput - Agility Business functionality delivered by release Specs products vs delivered (functional) Mixed Group Effort/EFP (Or AEFP) Calculated Effort/AEFP or Actual Effort/AEFP Time Mixed Group EFP Delivered/100 Worked Hours Time Mixed Product Defect per 100 Resource Hours Time Mixed Application Development Impact Calculated Effort/Tracked Total Effort. May be adjusted for change in quality. Typically function points est to function points produced FP Estimated Mixed Program CV per 100 Resource Hours Time Mixed Application FTE/1k AFP maintained FTE - running average of team size Hours Mixed Application Cost/1k AFP maintained Cost Primary indicator Secondary Indicator Future Plans • Q1-2018 • Integration of model based estimates (Statistical models of like projects) • Productivity, Cost and Velocity at the team level • Q2 – 2018 • Expansion by 100 applications as M&A completes • Scan for M&A targets to estimate workload • Full DevOps integration with code drops weekly “CAST enables speed. As the portfolio changes, the information about that portfolio becomes the essential decision tool” Client Exec 36
  • 37. CAST Confidential Client 2 – 200 Vendor Managed Applications At a glance 200 applications Minimum 1 scan every 4 months (Production) 100 vendor applications scheduled to be scanned monthly with SLA penalties and Incentives 2 application “module” scans on demand (usually 2 a week for high development) Full Quality and FP configuration, early AEP stages Average Scan time is 1-1.3 hrs Mix of Technologies from mainframe (Cobol, PL1) to 3G ( C, C++) and current suites (.net, JEE), early Python Adopter Fully Automated (Jenkins) moving to JIRA also Total support staff for back-office (2 FTE) – 10 staff that handle results consulting as well. FP calibration done on 10% of the apps per year Manual FP counts on 40 applications spending 400 effort hours (***) per application Future Plans •  Q4-2017 •  Unit Cost Measure rollout (Productivity) and model based estimation •  AEP pilot in progress •  Expansion of Vendor Involvement •  Q1 – 2018 •  Module Scan process for faster return •  JIRA flow to deliver results to Developers faster •  Full DevOps integration with code drops weekly •  Vendors on the hook to produce increasing FP/HR while not compromising quality •  Results reviewed monthly with true-ups •  All Vendors are on automated process ( Standard of Legitimacy rule) 37
  • 38. CAST Confidential Client 3 – 10 Applications, target 100 At a glance 10 current applications Minimum 1 scan every Month (Production) Rapid response scans as needed based on production issues. Full Quality and FP configuration, fully automated, calibration toolkit Average Scan time is 30 min Mix of Technologies from mainframe (Cobol, PL1) to 3G ( C, C++) but mostly focused on Cobol Fully Automated (Jenkins) moving to JIRA also with a pull from source code control Total support staff for back-office (2 FTE) . FP calibration done by toolkit Future Plans Q4-2017 •  More applications, FASTER •  Improve Predictable scan times •  Tighten the integration to Eclipse Q1 – 2018 •  Push more information to executive dashboard including FP counts •  Investigate productivity using FP or AEP depending on study results •  Need to push velocity without compromising customer experience or security. •  FP/HR while not compromising quality 38
  • 39. CAST Confidential 39 Agenda Context presentation •  Drivers for FSM-related measurement •  Black box / white box measurement •  Integrated measurement Measurement solutions – OMG standards •  Increase visibility on software size •  Increase visibility on software-related activity •  Increase visibility on software quality •  List of available metrics Effective sizing metrics •  Objectives and definitions •  Samples Conclusions
  • 40. CAST Confidential 40 Conclusions CAST delivers analytics •  Based on OMG standards (AFP/AEP/ASCRM/ASCSM/ASCPEM/ASCMM) •  Assessing various aspects of software (size/activity/quality) •  Encompassing both functional and non-functional requirements To increase visibility To support effective indicators
  • 41. CAST Confidential Thank you for attending APPLICATION INTELLIGENCE PLATFORM •  Approx. 2,300 apps and 3 billion LoC •  Query by industry, technology & geo •  CRASH Annual Report •  CAST Research Labs •  Custom benchmarks •  SaaS, Cloud based •  Source code analyzed where it resides •  Rapid portfolio analysis •  Portfolio continuous monitoring •  Software flaw detection •  Architectural analysis and blueprinting •  Critical violation drill down •  Propagation risk •  Standards-based software metrics •  Automated function points •  Trend analysis •  Transaction risk 41