    Ch 12 (SPI) CMMI SCAMPI: Presentation Transcript

    • Kittitouch Suteeca. Ref.: Panit Watcharawitch, PhD (Cantab)
    • Standard CMMI Appraisal Method for Process Improvement (SCAMPI)
      - SCAMPI is designed to provide benchmark-quality ratings relative to Capability Maturity Model® Integration (CMMI℠) models. It is applicable to a wide range of appraisal usage modes, including both internal process improvement and external capability determinations. SCAMPI satisfies all of the Appraisal Requirements for CMMI (ARC) for a Class A appraisal method and supports the conduct of ISO/IEC 15504 assessments.
      - Key terms: SCAMPI Appraisal System (SAS), Method Definition Document (MDD), Appraisal Requirements for CMMI (ARC), Appraisal Team Member (ATM), Appraisal Disclosure Statement (ADS)
    • SCAMPI Class A, B, C
    • Objectives of a SCAMPI Class A appraisal
      - Understand the currently implemented process.
      - Identify process weaknesses (and strengths) in the organizational unit.
      - Determine the degree of satisfaction of each CMMI process area goal investigated.
      - Assign ratings, if requested by the appraisal sponsor.
    • Rating process attributes
      - Process attribute rating values: the ordinal rating scale defined below shall be used to express the levels of achievement of the process attributes.
      - N (Not achieved): there is little or no evidence of achievement of the defined attribute in the assessed process.
      - P (Partially achieved): there is some evidence of an approach to, and some achievement of, the defined attribute in the assessed process. Some aspects of achievement of the attribute may be unpredictable.
    • Rating process attributes
      - L (Largely achieved): there is evidence of a systematic approach to, and significant achievement of, the defined attribute in the assessed process. Some weaknesses related to this attribute may exist in the assessed process.
      - F (Fully achieved): there is evidence of a complete and systematic approach to, and full achievement of, the defined attribute in the assessed process. No significant weaknesses related to this attribute exist in the assessed process.
    • Rating process attributes: the corresponding values shall be:
      - N (Not achieved): 0% to 15% achievement
      - P (Partially achieved): >15% to 50% achievement
      - L (Largely achieved): >50% to 85% achievement
      - F (Fully achieved): >85% to 100% achievement
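    The thresholds above map directly to a rating. As a minimal illustrative sketch (not part of the slides), the following Python function assigns the N/P/L/F rating from an achievement percentage:

# Sketch: map an achievement percentage to the N/P/L/F rating
# using the ISO/IEC 15504 thresholds listed above.

def rate_attribute(achievement_pct: float) -> str:
    """Return the rating for a process attribute given its % achievement."""
    if not 0.0 <= achievement_pct <= 100.0:
        raise ValueError("achievement must be between 0 and 100")
    if achievement_pct <= 15.0:
        return "N"  # Not achieved: 0% to 15%
    if achievement_pct <= 50.0:
        return "P"  # Partially achieved: >15% to 50%
    if achievement_pct <= 85.0:
        return "L"  # Largely achieved: >50% to 85%
    return "F"      # Fully achieved: >85% to 100%

# Example: 72% achievement of an attribute is rated Largely achieved.
print(rate_attribute(72.0))  # -> "L"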
    • Example: CMMI ML2 Appraisal of RUP, Requirements Management (REQM). Reference: http://www.ibm.com/developerworks/rational/library/dec05/grundmann/
    • Example: CMMI ML2 Appraisal of RUP #2, Project Planning (PP)
    • Example: CMMI ML2 Appraisal of RUP #3, Supplier Agreement Management (SAM)
    • Example: CMMI ML2 Appraisal of RUP #4, Generic Goals and Generic Practices
    • Example: CMMI ML2 Appraisal. [Chart: "SPI with CMMI Project Summary, Maturity Level 2" showing, for REQM, PP, PMC, SAM, MA, PPQA and CM, the % complete of each PA and the % actual complete of each PA.]
    • Example: CMMI ML2 Appraisal. [Chart: "SPI with CMMI Project Summary, Maturity Level 2" showing, for the same PAs, the estimated % of the project in each PA and the % complete of each PA in the project.]
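    The two charts appear to present the same per-PA data in two ways. A hedged sketch follows, assuming that the project-level view weights each PA's completion by its estimated share of the project; the process-area names come from the charts, but every number below is invented purely for illustration:

# Illustrative sketch of the two summary views shown in the charts above.
# Assumption (not stated in the slides): "% complete of PAs in project" is
# the per-PA completion weighted by each PA's estimated share of the project.

pa_estimated_share = {   # hypothetical % of project effort per ML2 PA
    "REQM": 15.0, "PP": 20.0, "PMC": 15.0, "SAM": 5.0,
    "MA": 15.0, "PPQA": 15.0, "CM": 15.0,
}
pa_complete = {          # hypothetical % of each PA's practices complete
    "REQM": 90.0, "PP": 75.0, "PMC": 60.0, "SAM": 100.0,
    "MA": 50.0, "PPQA": 40.0, "CM": 80.0,
}

for pa, share in pa_estimated_share.items():
    weighted = pa_complete[pa] * share / 100.0
    print(f"{pa}: {pa_complete[pa]:.0f}% complete, "
          f"{weighted:.1f}% of project (estimated share {share:.0f}%)")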
    • Example: CMMI ML2 Appraisal
    • Scoring
    • Generic Goals & Generic Practices GG 1 Achieve Specific Goals GP 1.1 Perform Specific Practices GG 2 Institutionalize a Managed Process GP 2.1 Establish an Organizational Policy GP 2.2 Plan and Process GP 2.3 Provide Resources GP 2.4 Assign Responsibility GP 2.4 Train People GP 2.6 Manage Configuration GP 2.7 Identify and Involve Relevant Stakeholders GP 2.8 Monitor and Control the process GP 2.9 Objectively Evaluate Adherence GP 2.10 Review Status with Higher Level Management GG 3 Institutionalize a Defined Process GP 3.1 Establish a Defined Process GP 3.2 Collect Improvement Information GG 4 Institutionalize a Quantitatively Managed Process GP 4.1 Establish Quality Objective for the Process GP 4.2 Stabilize Subprocess Performance GG 5 Institutionalize an Optimizing Process GP 5.1 Ensure Continuous Process Improvement GP 5.2 Correct Root Causes of Problem StagedRepresentation 16
    • Appraisal keys
      - Appraisal Considerations
      - Direct Artifacts
      - Indirect Artifacts
    • Example: REQM ML2
      - 15 key practices: 5 specific practices and 10 generic practices
      - SP 1.1-1 Obtain an Understanding of Requirements: develop an understanding with the requirements providers on the meaning of the requirements.
      - Appraisal considerations: consider the existence of requirements at multiple levels (e.g., product and product-component requirements).
    • Example: REQM ML2, Direct Artifacts
      - Allocated requirements document
      - Allocated requirements database
      - Review report of requirements
      - Commitments from affected groups on allocated requirements
      - Defined criteria for evaluation and acceptance of requirements
    • Example: REQM ML2, Indirect Artifacts
      - Evidence of early review (before committing to develop the corresponding software) of system requirements allocated to software (e.g., meeting minutes, reports, …)
      - A list or characterization of requirements providers authorized to provide direction
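    During a SCAMPI appraisal, direct and indirect artifacts such as these are recorded per practice. Below is a minimal sketch of one way to keep that record, using hypothetical names and a simplified sufficiency rule that is not defined by SCAMPI:

# Hypothetical sketch: recording the evidence collected for one practice
# (REQM SP 1.1-1) as direct and indirect artifacts. The names and the
# "one of each kind" rule are illustrative, not SCAMPI-defined.

from dataclasses import dataclass, field
from typing import List

@dataclass
class PracticeEvidence:
    practice_id: str
    direct_artifacts: List[str] = field(default_factory=list)
    indirect_artifacts: List[str] = field(default_factory=list)

    def has_evidence(self) -> bool:
        # Simplified rule: at least one direct and one indirect artifact.
        return bool(self.direct_artifacts) and bool(self.indirect_artifacts)

sp_1_1_1 = PracticeEvidence(
    practice_id="REQM SP 1.1-1 Obtain an Understanding of Requirements",
    direct_artifacts=[
        "Allocated requirements document",
        "Review report of requirements",
    ],
    indirect_artifacts=[
        "Meeting minutes of early requirements reviews",
    ],
)
print(sp_1_1_1.has_evidence())  # -> True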
    • Examples of ML2 and ML3: available on the SPI site.