Software Productivity Framework
    Presentation Transcript

    • Productivity Assessment
      Framework
      January, 2010
    • 2
      In this module, we will define and analyze the various metrics that impact productivity and identify the key existing challenges
      Objective: identify the key productivity metrics to measure across the organization
      • Set up and monitor the metrics
      • Measure key productivity gaps across teams
      Why do we need to track productivity?
      To understand the throughput of the engineering resources, based on
      • Life cycle of the product
      • Geographic location
    • 3
      We will leverage our existing framework and customize it to suit the Client's needs
      Developing The Framework
      Data Collection
      Analysis
      Report & Optimization
      • Review Zinnov’s productivity framework
      • Identify gaps in existing metrics
      • Finalize the metrics for monitoring and measurements
      • Identify the mechanism to collect the data for metrics
      • Review existing artifacts (product and project management documents) to collect the data
      • Interview key stakeholders to collect missing information and to validate the data collected through artifacts
      • Review the data to understand the key gaps between the teams
      • Capture qualitative and quantitative insights from the analysis
      • Identify key risks impacting productivity across the organization
      • Propose risk mitigation strategies
      • Deliver a comprehensive report
    • 4
      Zinnov will define the productivity drivers and metrics using experiences from other engagements and through discussions with Client (1/2)
      Communication
      • Travel between the centers
      • Number/Type of issue escalations
      • Status reports
      • Meeting effectiveness
      • Project transparency
      Human Resources
      • Brand recognition
      • Team ramp up time
      • Attrition rate
      • Capability development
      • Resource allocation
      • Interview process efficiency
      • Employee motivation
      Infrastructure
      • Internet latency
      • Commute time
      • Phone/internet connectivity from home
      • IT support
      • Lab availability
      Productivity Drivers
      Organization Structure
      • Project ownership in India center
      • Technical ownership
      • Team alignments
      • Customer interaction
      • Level of autonomy
      Knowledge Transfer and Management
      • Time spent on knowledge transfer
      • Mode of knowledge transfer
      • Average number of reviews
      • Effectiveness of KMS
      Development Process
      • Coding standard
      • Documentation
      • CMS process
      • Release processes
      • Risk planning
      • Requirement mgmt
      • Change management
      • Project initiation
    • 5
      Zinnov will define the productivity drivers and metrics using experiences from other engagements and through discussions with Client (2/2)
      Definition of Productivity Metrics for the Engineering Group
      A
      C
      B
      • % of milestone slippage
      • Estimation accuracy
      • Effort variance
      • Defect density
      • Field error rate
      • Defect removal efficiency
      • Maintenance rework
      • Response time
      • # open P1, P2, P3
      • Cost of quality
      • % of automation
      • Test case defect detection efficiency
      • Bug type distribution
      • Test cycle distribution
      • Field error rate
      DEVELOPMENT
      SUSTENANCE
      QUALITY ASSURANCE
      • The key metrics the Client requires to measure productivity will be determined, and appropriate metrics will be identified
      • Metrics could be based on factors such as center maturity, people maturity, product maturity, or the PDLC process
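The development, sustenance, and quality-assurance metrics listed above can be made concrete with formulas. A minimal Python sketch, assuming the common industry definitions (the Client's finalized definitions would be agreed during the framework phase):

```python
# Hedged sketch of a few of the listed metrics using common industry
# definitions. Function names and signatures are illustrative assumptions.

def defect_removal_efficiency(pre_release_defects: int, post_release_defects: int) -> float:
    """Share of defects caught before release (DRE)."""
    total = pre_release_defects + post_release_defects
    return pre_release_defects / total if total else 0.0

def defect_density(defects: int, kloc: float) -> float:
    """Defects per thousand lines of code."""
    return defects / kloc

def effort_variance(planned_hours: float, actual_hours: float) -> float:
    """Relative deviation of actual effort from the plan."""
    return (actual_hours - planned_hours) / planned_hours

def milestone_slippage_pct(slipped: int, total_milestones: int) -> float:
    """Percentage of milestones that missed their planned date."""
    return 100.0 * slipped / total_milestones

print(defect_removal_efficiency(90, 10))   # 90 of 100 defects caught pre-release
print(effort_variance(100, 120))           # 20% over plan
```

Each of these normalizes raw counts so that teams of different sizes and products of different scales can be compared on the same axis.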
    • 6
      In addition, we will baseline and finalize the metrics and drivers
      Metrics and Drivers Baseline
      1. Define baselines
      2. Map complexity vs. impact
      3. Analyze impact on productivity
      4. Analyze data collection complexity
      5. Review with stakeholders
      [Figure: two scatter plots, "Productivity Drivers" and "Productivity Metrics", each positioning lettered drivers and metrics by Impact (x-axis) against Complexity in Collecting Data (y-axis)]
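The complexity-vs.-impact mapping in step 2 can be sketched as a simple quadrant rule: measure the high-impact, easy-to-collect items first. This is an illustrative assumption, not Zinnov's actual scoring model; the driver names and 1-5 scales are hypothetical:

```python
# Illustrative quadrant classification of productivity drivers.
# Scales (1-5) and the threshold are assumptions for the sketch.

def quadrant(impact: int, complexity: int, threshold: int = 3) -> str:
    """Classify a driver by impact on productivity vs. data-collection complexity."""
    hi_impact = impact >= threshold
    lo_cost = complexity < threshold
    if hi_impact and lo_cost:
        return "baseline now"          # easy wins: measure immediately
    if hi_impact:
        return "plan collection"       # valuable, but costly to measure
    if lo_cost:
        return "track opportunistically"
    return "defer"

# Hypothetical (impact, collection-complexity) scores for a few drivers
drivers = {
    "Attrition rate": (5, 1),
    "Meeting effectiveness": (4, 4),
    "Commute time": (2, 1),
    "KMS effectiveness": (2, 5),
}
for name, (impact, complexity) in drivers.items():
    print(f"{name}: {quadrant(impact, complexity)}")
```

The stakeholder review in step 5 would then adjust the scores and threshold before the baseline is frozen.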
    • 7
      An excel modeler will be developed that will assist in the collection, monitoring and tracking of the drivers and metrics
      • Metrics and driver information will be inputted into the modeler
      • The modeler will have the capability
      • To compare metrics between projects
      • To plot historic trends of metrics
      • To plot relationship between the metrics and drivers
      • The modeler can also be used as a dashboard to measure the health of the projects
      7
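As a rough stand-in for the Excel modeler described above, the following sketch shows the two core capabilities, cross-project comparison and historic trends. The class and field names are assumptions, not the deliverable's actual design:

```python
# Minimal metric store illustrating the modeler's comparison and trend
# capabilities. Structure and names are hypothetical.
from statistics import mean

class MetricsModeler:
    def __init__(self):
        self.data = {}  # {project: {metric: [values over reporting periods]}}

    def record(self, project: str, metric: str, value: float) -> None:
        """Append one period's value for a project's metric."""
        self.data.setdefault(project, {}).setdefault(metric, []).append(value)

    def compare(self, metric: str) -> dict:
        """Latest value of a metric for every project that reports it."""
        return {p: m[metric][-1] for p, m in self.data.items() if metric in m}

    def trend(self, project: str, metric: str) -> float:
        """Latest value minus the historic average (negative = improving for defect metrics)."""
        series = self.data[project][metric]
        return series[-1] - mean(series)

m = MetricsModeler()
for v in (4.0, 3.5, 3.0):                       # ProjA's defect density over 3 periods
    m.record("ProjA", "defect_density", v)
m.record("ProjB", "defect_density", 2.5)
print(m.compare("defect_density"))              # {'ProjA': 3.0, 'ProjB': 2.5}
print(m.trend("ProjA", "defect_density"))       # -0.5 (falling, i.e. improving)
```

In the actual deliverable the same tables would live in Excel sheets, with the dashboard view built on top of them.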
    • 8
      The productivity metrics will be presented as a dashboard to the executive management team
      Sample Output (dashboard views):
      • Planned vs. Actual Hours
      • Percentage Engineering Effort
      • Details of Bugs Re-opened
      • Planned vs. Actual Features
    • 9
      Case Study 2: We assisted an industry-leading product company in defining productivity and quality measures and benchmarking them across their global centers
      Challenge: Productivity measurements were frowned upon by the client stakeholders due to the complexities involved. Executives were frustrated with the level of detail and the assumptions provided
      Key Challenges
      • Process and metrics should be simple yet powerful
      • Ability to easily compare productivity across Business Units and Geographies
      • Strong framework that would minimize assumptions
      Created Objectives
      Zinnov and the client worked to propose an objective mechanism to measure the productivity of Development and Quality teams
      Defined Frameworks
      Framework was defined to measure the complexity of the product based on the architecture, feature set, functionality size, internal/external interfaces, data structure etc.
      Measured Efforts
      A simple mechanism was devised to calculate the efforts using the experience levels, size of the team, man hours for each activity etc.
      Zinnov Solution
      Output Metrics
      For the quality engineering teams, similar complexity metrics were defined using the number of bugs captured, the number of bugs not captured, the percentage of automation, etc.
      Compared Productivity
      Using the weighted averages of the complexity and the engineering efforts, we obtained a directional productivity comparison between teams working across the organization
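The weighted-average comparison described above might look like the following sketch. The factor names, weights, and scores are hypothetical illustrations, not the client's actual scoring model:

```python
# Directional productivity: complexity-weighted output per unit of effort.
# All weights and scores below are assumptions for illustration.

def complexity_score(factors: dict, weights: dict) -> float:
    """Weighted average of complexity factors, each scored on a 1-5 scale."""
    total_weight = sum(weights.values())
    return sum(factors[k] * weights[k] for k in weights) / total_weight

def directional_productivity(complexity: float, effort_hours: float) -> float:
    """Higher value = more complex work delivered per engineering hour."""
    return complexity / effort_hours

weights = {"architecture": 3, "feature_set": 2, "interfaces": 1}
team_a = complexity_score({"architecture": 4, "feature_set": 3, "interfaces": 2}, weights)
team_b = complexity_score({"architecture": 2, "feature_set": 4, "interfaces": 5}, weights)

# Compare two teams that spent the same effort on work of different complexity
print(directional_productivity(team_a, 1200) > directional_productivity(team_b, 1200))
```

The comparison is deliberately directional: it ranks teams rather than claiming an absolute productivity number, which is what keeps the calculation simple for stakeholders.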
    • 10
      Case Study 2: We assisted an industry-leading product company in defining productivity and quality measures and benchmarking them across their global centers
      Output: The questionnaire based framework along with the dashboard view provided key insights to the executives on the productivity and quality differences between the teams
      Deliverables
      • Framework with the guidelines for the input
      • Comprehensive questionnaire with details on scoring mechanism
      • Excel based tool to capture the complexity of the team based on the User input
      • Dashboard view to objectively compare the differences between teams
      KEY BENEFITS
      • Ability to measure directional productivity without getting into complex calculations
      • Simple effort from the client to collect, measure, and monitor the metrics on an ongoing basis
      • Normalized assumptions to minimize measurement error
      • Ability to measure metrics across business units, product/feature-based teams, and geographically diversified teams
    • 11
      Case Study 3: We engaged with a mid-sized product company to analyze the productivity of their product development teams
      Output: An apples-to-apples comparison between the global teams was not feasible due to the differing maturity of the organizations. Only the engineering metrics were collected, which resulted in dissatisfaction across teams
      [Chart: Companies vs. Client Experience, Development team – Number of Software Developers (31 vs. 18) and Years of Experience (11 vs. 6)]
    • 12
      Case Study 3: We also provided recommendations to enhance productivity
      Output: On the basis of our analysis, the Client looked to resolve some constraints that impacted the productivity of the teams
      Organization cultural issues
      • There seems to be a high tolerance for non-performance in the organization; few people have been let go due to non-performance.
      • There have been no instances of successfully initiated and implemented change initiatives.
      Project model with inbuilt overheads
      • A number of projects are still in a split-team model. These projects carry an inherent overhead in terms of communication and management time from the US, and it is also extremely difficult for the US managers to track the tasks executed by the local teams.
      Lack of well-defined product planning process
      • Even the product managers of different products were not clear on the roadmaps for those products. There were at least two scenarios where the engineering team spent months building features/add-ons that the product managers were not aware of or had not consented to.
      Lack of domain exposure to India center
      • Most of the engineers have never met a customer or even seen a video of how customers use the products built from the India center. Only a few engineers have been exposed to customers at trade shows.
    • 13
      About Zinnov
      Zinnov helps organizations globalize their business and improve their people strategy. 
      Our consulting solutions are based on rigorous research techniques, data analytics and communities
      Zinnov Research: http://www.zinnov.com/white_papers.php
      Zinnov Blog: http://zinnov.com/blog/
      Zinnov Services: http://zinnov.com/services.html
      Zinnov Events: http://www.zinnov.com/zn_events.php
      Zinnov LinkedIn: http://www.linkedin.com/companies/30724
      Zinnov Facebook: http://www.facebook.com/pages/Zinnov/111718952202627
      @zinnov
      For additional details on the topic, contact info@zinnov.com
    • Thank You
      69 "Prathiba Complex", 4th 'A' Cross, Koramangala Ind. Layout, 5th Block, Koramangala
      Bangalore – 560095
      Phone: +91-80-41127925/6
      575 N. Pastoria Ave
      Suite J
      Sunnyvale
      CA – 94085
      Phone: +1-408-716-8432
      21, Waterway Ave, Suite 300
      The Woodlands
      TX – 77380
      Phone: +1-281-362-2773
      info@zinnov.com
      www.zinnov.com
      @zinnov