This presentation describes a practical implementation of the use of Nesma function points in Agile deliveries. It was presented by Richard Sweer of Finidy during the webinar 'Afrekenen met functiepunten' (settling up with function points). For more info: www.nesma.org; conference@nesma.org.
Afrekenen met functiepunten
1. 1/24
in other words, a different approach to agile team, (agile) project and supplier management
with an example from practice …
version 17 February 2022
2. 2/24
Finidy – if software really matters
• Services
• Project management (IPMA-A, IPMA-B, IPMA-C and PRINCE2 Practitioner certified)
• Line management (software development)
• Metric consultant
• Scrum master (Scrum and SAFe certified)
• Function Point counting (CFPA certified)
• IT quality measurement
• IT audits and assessments
• Member of
• IPMA-NL
• NEN standards committee Software and systems engineering
• NESMA and NESMA Workgroup counting guidelines
• Experiences
• Over 30 years in IT in different roles: programmer, designer, project manager, unit manager
• Mainly custom software (projects from 500 up to more than 400.000 hours, with durations from a couple of months to more than 4 years)
• More than 100 engagements with more than 30 companies
Richard Sweer
www.finidy.nl
richard@finidy.nl
LinkedIn|RichardSweer
3. 3/24
Agenda
• Sourcing types
• Core KPIs for every (agile) project & agile team
• Functional Size Measurement and Agile/Scrum
• The KPIs/metrics model
• Viterra case
• Conclusion and recommendations
4. 4/24
Sourcing types – measure your deliverables (in business terms)

• Time & Material: in a ‘Time & Material’ contract, the client agrees to pay the (internal) supplier based on the time spent by the supplier's employees & subcontractors to perform the work and the materials used.

• T&M with KPIs: in a ‘Time & Material with KPIs’ contract, the client agrees to pay the (internal) supplier based on the time spent by the (internal) supplier's employees to perform the work and the materials used, with the condition that certain pre-determined conditions for success (KPIs) are met.

• Output based: an ‘output-based contract’ is an agreement between a customer and the (internal) supplier which creates a relationship for the delivery of services or products. The driving force behind the contract is that it focuses on what the deliverables are in business terms rather than how they should be delivered. Payment is per output delivered (e.g. Function Points). The (internal) supplier is not paid if the quality requirements (Definition of Done) are not met. Higher productivity means more throughput and more profit (win-win scenario).

• Outcome based: with ‘outcome-based’ contracting, an agreement is made that an (internal) supplier or provider of services must achieve specific goals and is paid only when those objectives are met. Payment is tied to a percentage of the business benefits (reward sharing).

From Time & Material to outcome based, the maturity of client and (internal) supplier, the risk for the (internal) supplier, and the business value go from low to high; the client's risk regarding quality, duration, costs and happiness of business users goes from high to low.
5. 5/24
Core KPIs for every (agile) project & agile team
• Productivity rate = hours of effort ÷ (functional) size of the software product delivered
• Cost effectiveness = project/team dollar cost ÷ (functional) size of the software product delivered
• Product quality
1) Defect density = defects delivered ÷ (functional) size of the software product delivered
2) Quality attributes based on ISO 25010 (formerly ISO 9126)
3) Software architectural audits
• Satisfaction of users, PO, team, …
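The three ratio KPIs above are simple divisions by delivered functional size. A minimal sketch, with function names and example numbers invented for illustration (they are not from the presentation):

```python
def productivity(hours_of_effort: float, size_fp: float) -> float:
    """Productivity rate: hours of effort per function point delivered."""
    return hours_of_effort / size_fp

def cost_effectiveness(cost: float, size_fp: float) -> float:
    """Cost per function point delivered."""
    return cost / size_fp

def defect_density(defects: int, size_fp: float) -> float:
    """Defects delivered per function point."""
    return defects / size_fp

# Example: a sprint delivering 50 FP in 1,000 hours at a cost of 80,000, with 3 defects.
print(productivity(1000, 50))         # 20.0 hours/FP
print(cost_effectiveness(80000, 50))  # 1600.0 cost units/FP
print(defect_density(3, 50))          # 0.06 defects/FP
```

Because all three ratios share the same denominator, they stay comparable across teams only when the same functional size measurement is used everywhere.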
6. 6/24
• Functional Size Measurement (FSM)
• The process of measuring Functional Size
• ISO/IEC 14143-6:2012 Functional Size Measurements
• Functional Size
• A size of the software derived from quantifying the
Functional User Requirements
• Nesma: High level counting is the ‘standard’
• In the past ‘detail’ counting was the standard
• Examples
• ISO/IEC 24570:2018 NESMA (version 2.3)
• ISO/IEC 20926:2009 IFPUG (version 4.3.1)
• ISO/IEC 29881:2010 FiSMA (version 1.1)
• ISO/IEC 19761:2017 COSMIC FFP (version 4.0.2)
• Datawarehousing, SOA and UML (Nesma workgroups)
• Interface points (Finidy)
Functional Size Measurement - measure business functionality
Measure your business functionality!!
7. 7/24
Several FSM methods applicable to define functional size
Nesma
• Focus area: data transaction processing (database or file system)
• Counting base: only counts the functional business requirements that are requested by the user
• ISO/version: ISO/IEC 24570
• Latest update: 2018, version 2.3
• Owner: Nesma, a Dutch incorporated not-for-profit organization
• Other: > 50 certified experts in NL; primarily used in NL; high-level counting used as the "standard"

Nesma enhancement
• Focus area: data transaction processing (database or file system); a different way of counting for maintenance and enhancement situations, using an impact factor
• Counting base: only counts the functional business requirements that are requested by the user
• ISO/version: based on ISO/IEC 24570
• Latest update: 2019, version 2.3
• Owner: Nesma, a Dutch incorporated not-for-profit organization
• Other: > 50 certified experts in NL; primarily used in NL; high-level counting used as the "standard"

Derived from the Nesma (DWH, SOA, UML)
• Focus area: Datawarehousing (DWH), Service Oriented Architecture (SOA), Unified Modeling Language (UML)
• Counting base: only counts the functional business requirements that are requested by the user
• ISO/version: based on ISO/IEC 24570
• Latest update: DWH 2012 (version 1.2); SOA 2013 (version 1.0); UML 2008 (version 1.0)
• Owner: Nesma, a Dutch incorporated not-for-profit organization
• Other: designed in Nesma workgroups

IFPUG
• Focus area: data transaction processing (database or file system)
• Counting base: only counts the functional business requirements that are requested by the user
• ISO/version: ISO/IEC 20926
• Latest update: 2009, version 4.3.1
• Owner: IFPUG, a US non-profit, member-governed organization
• Other: fewer certified experts in NL; primarily used outside the Netherlands; almost similar to Nesma

COSMIC
• Focus area: data transaction processing (database or file system) or embedded software
• Counting base: only counts the functional business requirements that are requested by the user
• ISO/version: ISO/IEC 19761
• Latest update: 2010, version 4.0.2
• Owner: COSMIC, a Canadian incorporated not-for-profit organization
• Other: ±20 certified experts in NL; more complex counting per data layer and data movements

Derived from the COSMIC (DWH, SOA)
• Focus area: data transaction processing (database or file system) or embedded software
• Counting base: only counts the functional business requirements that are requested by the user
• ISO/version: based on ISO/IEC 19761
• Latest update: DWH 2018 (version 1.2); SOA 2019 (version 1.1.1)
• Owner: COSMIC, a Canadian incorporated not-for-profit organization
• Other: ±20 certified experts in NL; more complex counting per data layer and data movements

CAST
• Focus area: data transaction processing (database or file system) or embedded software
• Counting base: based on reverse engineering of the software architecture and data model; needs calibration to determine what the user requested
• ISO/version: ISO 19515:2019, based on IFPUG version 4.3.1
• Latest update: 2019, version 1.0
• Owner: CAST, a technology corporation founded in France
• Other: automated solution based on the developed software instead of on business requirements; CAST is a software quality measurement tool, and productivity measurement is part of it

Potential usage (all methods except CAST):
• Portfolio management – early determination of size and investment scenarios
• Risk management – realistic tenders/planning
• Vendor management – vendor invoicing and monitoring
• Productivity – portfolio monitoring and benchmark
For CAST, early determination is not possible: it measures based on realized software instead of requirement definitions.
8. 8/24
Selecting the right functional size measurement
Preference (high → low):
1. An ISO standard: Nesma, COSMIC, IFPUG, …
2. If not applicable: derived from an ISO standard – Nesma/COSMIC DWH, SOA, UML 1)
3. If not applicable: custom-made counting rules – no ISO standard!!
Main criterium: there must be a correlation between the functional size and the effort to specify, build, test and maintain the software!
1) only Nesma
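The main criterium above can be checked directly against historical data. A sketch that computes the Pearson correlation between functional size and effort for a handful of past projects; the project figures below are invented for illustration:

```python
def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Historical projects: functional size (FP) and realized effort (hours).
sizes   = [120, 300, 450, 800, 1500]
efforts = [2500, 6200, 8800, 16500, 31000]

r = pearson(sizes, efforts)
print(f"correlation size vs effort: r = {r:.2f}")
# An r close to 1 supports using this size measure for estimation;
# a weak correlation suggests the counting rules do not fit the work.
```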
9. 9/24
Functional Size Measurement and Agile/Scrum
• Indicative/high-level counting to provide estimates for Program Increment (PI) planning / portfolio management (SAFe)
• High-level counting to provide estimates for PO and stakeholder management (part of the DoR: is high-level FP counting possible?)
• Final counting and review of deliverables for invoicing
• Reflect on the functional size counting and the values of the metrics
• Story Points for team commitment & Functional Size Measurement (FSM) for competitive productivity, cost effectiveness and product quality
10. 10/24
Agenda
• Sourcing types
• Core KPIs for every (agile) project & agile team
• Functional Size Measurement and Agile/Scrum
• The KPIs/metrics model
• Viterra case
• Conclusion and recommendations
11. 11/24
The model – criteria I used for this model
Calculation rules and guidelines for all KPIs/metrics used must be objectively measurable, realistic, transparent, traceable to source code (code annotation), auditable, and continuously measured, to create a win-win situation between client and supplier.
Objectively measurable: separating facts from emotions and keeping the discussion focused on logic.
12. 12/24
The model – the right balance between the 4 KPIs (especially for output-based contracting)
• Quality: Definition of Done (do not pay if not met) – priority 1 for customers (based on experience)
• Time to market: bonus/malus
• Satisfaction: bonus/malus (impact)
• Productivity: price/hour per FSM unit (e.g. Function Points) – risk at the supplier (for the supplier the most important)
13. 13/24
The model – choose the right Functional Size Measurement
KPIs on three levels: main-level KPIs, second-level KPIs, and third-level metrics (35 to 40 metrics in total), with their relation to functional size and how they are measured:
1. Quality
   1. Code quality metrics (+ 8 metrics) – related to functional size: no – mostly automatic, reporting automatic
   2. Functional test coverage metrics (+ 2 metrics) – depends – often 100% automatic
   3. Defect (density) metrics (+ 9 metrics) – 100% – partly manual, reporting automatic
   4. MTBF metrics (+ 8 metrics) – partly – partly manual, reporting automatic
2. Satisfaction
   5. Satisfaction metrics (+ 5 metrics) – no – 100% manual
3. Time to market
   6. Time to market and MTTR metrics (+ 4 metrics) – mostly – partly manual, reporting automatic
4. Productivity
   7. Productivity metrics (+ 2 metrics) – 100% – partly manual, reporting automatic
14. 14/24
The model – don't compare apples with oranges
Hours per Function Point depend on 14 factors/cost drivers:
1. Size (factor is exponential) ***
2. Who (factor 10-20) ***
3. Technology (factor 2-6) ***
4. Which Functional Size Measurement will be used (Nesma, COSMIC, IFPUG)
5. Type of counting (indicative, high-level and/or detailed)
6. New build or maintenance, and which guidelines (Nesma 2.3/2.3)
7. Project or product functional size measurement
8. Production capacity per time unit (elapsed time, team size) ***
9. Faults per functional size (during acceptance and production)
10. Products to be delivered
11. Quality/acceptance criteria of the products to be delivered
12. Business/process complexity (factor 30-40) ***
13. Non-functional requirements (usability, performance, security, etc.)
14. Architecture (development, support and production – tools & hardware & software)
*** The dominant drivers: size, who, technology, # teams, complexity
Example technology profiles with different hours/FP: OLTP .NET, OLTP low-code, OLTP Java, DWH/BI
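One way to keep comparisons honest across such different cost-driver profiles is to group work into price/hour "baskets", as the conclusions of this deck also recommend. A hypothetical sketch; the basket names, thresholds and driver choices below are invented, not Finidy's actual rules:

```python
def basket(technology: str, size_fp: int, complexity: str) -> str:
    """Assign work to a comparison basket using three of the fourteen
    cost drivers named on this slide (technology, size, complexity).
    Only hours-per-FP figures within the same basket are comparable."""
    tech = "low-code" if technology == "low-code" else "coded"
    scale = "large" if size_fp > 1000 else "small"
    return f"{tech}/{scale}/{complexity}-complexity"

# A large coded Java replatforming vs. a small low-code application
# land in different baskets, so their hours/FP should not be compared.
print(basket("java", 8000, "high"))    # coded/large/high-complexity
print(basket("low-code", 300, "low"))  # low-code/small/low-complexity
```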
15. 15/24
Agenda
• Sourcing types
• Core KPIs for every (agile) project & agile team
• Functional Size Measurement and Agile/Scrum
• The KPIs/metrics model
• Viterra case
• Conclusion and recommendations
16. We are Viterra, bringing producers and consumers together in a world-leading, fully integrated agriculture network.
• Replatforming of an existing application of 8.000 FP 1)
• New application of 900 FP 1)
• Maintenance of both applications
• Line organization of 100+ team members with 8 Scrum teams
• Four Scrum teams work at the output-based sourcing level and four Scrum teams at the T&M-with-KPIs sourcing level
1) EIF and ILF are not counted
17. KSF - Step 1: what's in the price per Function Point (FP)
• Describe all the 'system' deliverables (the output) – based on the PRINCE2 2017 template (step 1a)
• Determine what is in the price per FP and what is not (step 1b & step 1c)
• For each product and for each activity/Scrum ceremony

Step 1a – product descriptions, the 'system' deliverables:
• ARC-01: Project Start Architecture
• BA-01: User Stories
• BA-02: Non-functional requirements
• BA-03: UX design
• BUILD-01: Technical design
• BUILD-02: Unit test scripts
• BUILD-03: Source code
• BUILD-04: Release notes
• BUILD-05: Deployment packages
• TEST-01: Product risk analysis
• TEST-02: Component (API/VIEW)
• TEST-03: End-to-End
• TEST-04: Load & performance
• TEST-05: Release advice
• ….

Example product description – BUILD-03: Source code
• Objective: a collection of files containing code written in a specific language (e.g. TypeScript, C#, PL/SQL) that is 'translated' into a set of instructions that can be executed by an execution environment (e.g. JavaScript engine, .NET, Oracle).
• Composition: source code is distinguished based on technology – front end (TypeScript / HTML 5 / SCSS), back end (C#), database (Oracle).
• Appearance and format: IDE (Visual Studio, PL/SQL Developer); source code is stored in a TFS git repository.
• Acceptance/quality criteria: meet the Definition of Done; TIOBE TICS ratings: Abstract Interpretation A, Cyclomatic complexity B, Compiler warnings C (no level 1), Code standards A, Code duplication C, Dead code A.

Step 1b – is each 'system' deliverable included in the price/FP?
• ARC-01: Project Start Architecture – YES
• ARC-02: Project Architecture deviations – YES
• BA-01: Non-functional requirements – YES
• BA-02: User Stories – YES
• BA-03: UX design – NO
• … – …

Step 1c – activities included in the price/FP:
• Generic components and accelerators
• Coding and testing of PL/SQL programs
• Deployment (up to Acceptance)
• …
Activities excluded from the price/FP:
• Design, implementation or support of infrastructure
• Handover to the line organization
• Load, stress and performance tests
• Scrum Master and project management
• Production deployment
• Support for solving defects
• ….
18. KSF - Step 2: Payments associated with clear output moments
Per sprint/iteration (…, n-1, n, n+1):
• Quality gate 1: Definition of Ready (entry to the sprint)
• Quality gate 2: Definition of Done + review of the code (architecture/technical debt); Viterra reviews sprint n / iteration n → pay x% of the price per FP
• UAT/exploratory testing by Viterra, plus performance and load/stress testing by Viterra
• Quality gate 3: release accepted and live in production → pay the other x% of the price per FP
• x consecutive weeks in production with no defects
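The gate-based payment flow above can be sketched in a few lines. The 70/30 split and the price per FP below are invented placeholders; the slide deliberately leaves both percentages as "x%":

```python
PRICE_PER_FP = 400.0   # hypothetical price per function point
GATE2_SHARE = 0.70     # paid when the DoD is met and the code review passes
GATE3_SHARE = 0.30     # paid when the release is accepted (quality gate 3)

def payment(fp_delivered: float, dod_met: bool, release_accepted: bool) -> float:
    """Staged output-based payment: nothing at all if the Definition
    of Done is not met, a first tranche at quality gate 2, and the
    remainder once the release is accepted."""
    if not dod_met:
        return 0.0
    amount = fp_delivered * PRICE_PER_FP * GATE2_SHARE
    if release_accepted:
        amount += fp_delivered * PRICE_PER_FP * GATE3_SHARE
    return amount

print(payment(50, dod_met=True, release_accepted=True))   # 20000.0
print(payment(50, dod_met=True, release_accepted=False))  # 14000.0
print(payment(50, dod_met=False, release_accepted=False)) # 0.0
```

Withholding the second tranche until acceptance is what keeps the supplier's risk aligned with delivered quality, rather than with hours spent.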
19. 1. Define metrics per KPI
2. Select a subset (a combination of priority and easy to measure), like happiness or static code quality with SonarQube/TIOBE.
3. Define targets, and decide where to measure and by whom (Excel is good enough)
4. Start to measure (with Excel)
5. Extend your measurements every sprint.
Better - code quality Target Where Average Sprint 1 Sprint 2 Sprint 3 Sprint 4
TIOBE - Code Coverage B C A B C D
TIOBE - Abstract interpretations C B A D B A
TIOBE - Cyclomatic complexity E B A B F F
SonarQube - Bugs - reliability B A C C B A
SonarQube - Vulnerabilities - security C A A B D D
Better - Defect metrics Target Average Sprint 1 Sprint 2 Sprint 3 Sprint 4
Maximum defect with severity 1 in first iteration of UAT (defects/FP) 1 UAT 1,667 1 3 3 1
Maximum defects with severity 1 and 2 in Production (defects/FP) 2 Production 1,833 2 1 2 3
Defect with severity 1 and 2 in code before production move 0 DoD 0,5 0 1 1 1
Happier - Satisfaction metrics (value between 1 and 10 where 10 is most satisfied) Target Average Sprint 1 Sprint 2 Sprint 3 Sprint 4
Development team happiness (per sprint) 7 Sprint review 6,75 7,5 7,0 6,0 7,0
Development team collaboration (per sprint) 7 Sprint review 6,75 8,5 8,0 6,0 5,0
Product Owner satisfaction (per sprint) 7 Sprint review 7,92 7,5 7,0 8,0 8,0
Faster - through-put metrics Target Average Sprint 1 Sprint 2 Sprint 3 Sprint 4
Amount of FP per sprint 50 DoD 48 20 30 55 60
Minimum amount of Story Points/FP in stock 60 DoD 52,5 20 40 55 60
Faster - response and resolution Target Average Sprint 1 Sprint 2 Sprint 3 Sprint 4
Response & resolution time for defects severity 1 (during UAT in days) 2 UAT 2,0 2,0 2,0 2,0 2,0
Response & resolution time for defects severity 1 (during production in days) 1 Production 1,43 2,0 0,6 2,0 1,0
Response & resolution time for defects severity 2 (during production in days) 2 Production 1,5 2,0 3,0 1,0 1,0
Cheaper - productivity metrics (per sprint in hours/FP) Target Average Sprint 1 Sprint 2 Sprint 3 Sprint 4
Business analyst 4 DoD 4,7 6,0 5,0 5,0 3,5
Build (included unit tests) 10 DoD 8,3 7,0 8,0 10,0 8,0
Testing (view, API, E2E) 6 DoD 6,8 7,0 7,0 8,0 7,0
Total 20 DoD 19,8 20,0 20,0 23,0 18,5
Dashboard value stream/project xyz (per SAFe value stream, Scrum team or Agile project; target values are fictitious)
KSF - Step 3: Start to measure – if it is too overwhelming, start small!
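Starting small, the dashboard above can be reproduced with a few lines of plain Python (or, as the slide notes, Excel). The values are a subset of the fictitious sprint values shown; averages here are simple means over the four sprints:

```python
metrics = {
    # name: (target, [sprint 1..4 values], lower_is_better)
    "FP per sprint":      (50, [20, 30, 55, 60], False),
    "Total hours per FP": (20, [20.0, 20.0, 23.0, 18.5], True),
    "PO satisfaction":    (7,  [7.5, 7.0, 8.0, 8.0], False),
}

for name, (target, values, lower_is_better) in metrics.items():
    avg = sum(values) / len(values)
    # For cost-type metrics (hours/FP) lower is better; for output and
    # satisfaction metrics higher is better.
    on_target = avg <= target if lower_is_better else avg >= target
    status = "OK" if on_target else "attention"
    print(f"{name}: avg {avg:.2f} vs target {target} -> {status}")
```

Once this works in a spreadsheet or script, each sprint can add a column and each retrospective can add a metric, which is exactly the "extend every sprint" step above.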
20. Step by step to reach your goal – automate as much as possible
Agile KPIs
with underlying metrics
21. Example 1 – Fact-based modernization with the use of Function Points
22. Example 2 – Measuring Business Functionality/Value & Progress with FP
FP per sprint/iteration, by stage:
• Production: 1,043 FP (46%)
• Quality gate 2 - pass: 394 FP (18%)
• Quality gate 2: 481 FP (22%)
• At team level: 304 FP (14%)
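The stage distribution above is simply each stage's FP count divided by the total. A sketch using the slide's numbers; note that standard rounding can differ by a percentage point from the figures shown on the slide:

```python
# FP counts per pipeline stage, taken from the slide.
stages = {
    "Production": 1043,
    "Quality gate 2 - pass": 394,
    "Quality gate 2": 481,
    "At team level": 304,
}

total = sum(stages.values())  # 2222 FP in total
for stage, fp in stages.items():
    # Share of the total functional scope sitting in this stage.
    print(f"{stage}: {fp} FP ({fp / total:.0%})")
```

Tracking these shares per sprint shows business functionality flowing toward production, which is the progress view this example illustrates.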
23. Example 3 – Defect Density with the use of Function Points
24. 24/24
Conclusion and recommendations
• Choosing the right FSM is a key success factor
• Focus not only on hours per FSM unit/Function Point, but also on keeping a balance in agility, trust, continuity, productivity, satisfaction and quality. Quality has the highest priority!
• Create a win-win situation between client and supplier
• Define different price/hour baskets if the '14 factors' differ strongly within a portfolio
• KPIs/metrics
• must have a short feedback loop (minutes/hours)
• must be measured after every sprint and as much as possible automatically (test/build/deploy/release)
• must be used in retrospectives - without data you are just another person with an opinion
• Payments based on Function Points are also possible in Scrum Agile teams & projects
(output based contracting)!