BlueCross: Peer Review
Explaining and justifying the peer review process in project management.


  • This presentation covers: a brief introduction to SDM (Solution Delivery Methodology) and basic software development; an introduction to peer reviews; and the documentation our management has made mandatory for Prod Fix peer reviews.
  • The slides in this presentation may be detailed, but the presentation will be focused on attaining a general understanding.
  • A) Brief SDM Overview. When is SDM mandatory? Anything over 80 hours. B) Please notice the references at the bottom of the slides. These are here to acclimate you to the SDM documentation and can be used for individual research later.
  • A) Brief SDM Overview. When are you responsible for SDM compliance? For anything below Tiers 1 & 2. A project manager will be assigned to Tier 1 and Tier 2 projects and will be responsible for SDM; you are responsible for everything else. B) Some presentations indicate ~$36K for Tier 4.
  • Activity Diagram basics: The columns are separated by swimlanes, which indicate boundaries between systems or processes in Activity Diagrams. Anything directly on a swimlane line is a system or process shared by the two. Purple represents reports. A light purple parallelogram represents input and output. Boxes represent processes. Diamonds represent selection or decision. The line with the arrowhead represents synchronous process flow. B) From this complicated structure, notice this: in the left "light blue" column, the numbers represent sections in the eDART. In the grey boxes, a major SDM stage is always followed by a review, and code is always tested.
  • Behind SDM Introduction. The waterfall model is developing software all at once throughout the Software Development Life Cycle (SDLC). The rate of requirement change leads to poor software development that does not meet customer needs. A study by Boehm and Papaccio showed that a typical software project experienced a 25% change in requirements [BP88]. This trend was corroborated in another major study of thousands of software projects, with change rates that go even higher – 35% to 50% for large projects [Jones97]. – Applying UML and Patterns, 3rd edition, by Craig Larman. Iterative development is developing software in iterations (i.e., timeboxes of roughly 1 to 4 weeks) that are more adaptable to change. For more information please research the references. B) SDM takes an iterative development approach, but don't fall into the trap of "waterfall" thinking! Feedback and adaptation are critical components of software development.
  • This statistical analysis supports the idea that requirements will change. When change is not accounted for in software development, the project's chance of failure increases. Analysts report that as many as 71 percent of software projects that fail do so because of poor requirements management, making it the single biggest reason for project failure – bigger than bad technology, missed deadlines, or change-management fiascoes. (Reference: http://www.cio.com/article/14295/Fixing_the_Software_Requirements_Mess_ – "Fixing the Software Requirements Mess" by Christopher Lindquist, 11/15/2005.)
  • This is all the SDM documentation as of April. Please notice: in eDART, individual sections are associated with all the specific SDM forms for each particular stage. Notice that the eDART sections highlighted in red are associated with reviews. Here the focus is only on the logical, physical, and testing reviews. The eDART sections highlighted in blue are additional reviews you might be interested in. Reviews are associated with the "V&V" SDM Verification & Validation process. Notice this process is "iterative," both as a whole and by individual sections, through base-lining! Where you see base-lined, think change is possible. Remember, possibly up to 50% of your requirements will change! SDM tries to mitigate this risk by base-lining, but it will still happen. Risk is a concept that denotes a potential negative impact to an asset or some characteristic of value that may arise from some present process or future event. – Wikipedia
  • A) You prevent defects and this will improve the quality of the product. B) The cost of identifying and fixing a defect in the early stages of the lifecycle is significantly less than identifying it later. C) This will enable knowledge management and the colleagues involved will learn from one another.
  • A) Two things are required for Prod Fixes: (1) V&V0022-TMPL-Review Summary Report Form and (2) V&V0013-TMPL-Test Case Template-SDM-0013. Each is explained through (1) V&V0020-STD-Standard Procedures for Internal Review and (2) V&V0010-STD-Standard Procedures for Testing. Here we only introduce one review document for Prod Fix reviews. B) Don't be fooled! This is just for Prod Fixes. Any other SDM process must use all the mandatory documentation set forth by the amount of rigor (see PM0025-PRC-SDM Tailoring Guideline Procedure-SDM-0025). The Review Summary Report (V&V0022-TMPL-Review Summary Report Form-SDM-0022) and the Review Issue Log (V&V0021-TMPL-Review Issue Log-SDM-0021) contain spaces to record all these data items. All open issues are recorded on the issue log (V&V0021-TMPL-Review Issue Log-SDM-0021). Any uncorrected defects are logged into the project's defect-tracking system; see the Defect Management Process (CM0004-PRO-Project Testing Defect Mgmt Process-SDM-0004). In some projects it might make sense to combine the activities of Internal Reviews with the activities of Stakeholder Reviews (V&V0025-STD-Standard Procedures for Stakeholder Review-SDM-0025).
  • This is only for Prod Fixes. Note that if the review fails, you do not need any supplemental documentation to record the failure; just give some feedback. Verbal feedback is sufficient. There is no need to log these errors into the project's defect-tracking system since this is only a Prod Fix. Signoff is not a rubber stamp!
  • A) How to understand the SDM naming conventions based on process type. The "VV" category contains both reviewing and testing documentation. These are or will be located in the PAL – Process Asset Library.
  • A) How to understand the SDM naming conventions based on process categories. One process category can have multiple process types. The most common categories we will use are TMPL & STD. These are or will be located in the PAL – Process Asset Library.
  • How to find the PAL – Process Asset Library. Go to "Computer & Technology" from the FYI Blue "Home" page. Then select "ITG Workbench". Go into eDART. This is how the SDM documents are linked to eDART, either in general or by a specific eDART section. C) Here is the location of the peer review form.
  • You should become familiar with the "Verification and Validation" ("VV") process area. Please read the "STD" Standard Peer Review Procedures. All procedures you need to know are located in the PAL.
  • A) This shows how to access the PAL without going through eDART.
  • A) Agile Development encourages less documentation, within “Iterative Development”, and emphasizes working software as a means of progress. B) SDM incorporates Agile development, to a certain extent, but also wants evidence of SDM governance that will mitigate project risk. As a result “tailoring” is born! The term project governance is used in industry, especially in the information technology (IT) sector to describe the processes that need to exist for a successful project. IT Governance or ICT Governance, is a subset discipline of Corporate Governance focused on information technology systems and their performance and risk management. The rising interest in IT governance is partly due to compliance initiatives (e.g. Sarbanes-Oxley), as well as the acknowledgement that IT projects can easily get out of control and profoundly affect the performance of an organization.
  • A) Tailoring identifies rigor, a categorization of risk that determines the required level of SDM governance.
  • Here are some examples of tailoring that pertain to peer review. Notice that reviews should always be done; at low rigor this means an Ad Hoc review for logical design, physical design, and testing.
  • For Prod Fixes the programmer will be the Author, Review Coordinator, & Recorder.
  • A) Type of reviews. Please review Walkthroughs.
  • The most informal type of Internal Review, which has very little if any formal paper trail. Example: "Please help me fix this error!"
  • A) An informal type of Internal Review which does not rely on a formal feedback meeting. B) A simple discussion and immediate feedback about code. Example: "Can you check out the logical design real quick?"
  • The most formal of the internal review processes. Utilizes Fagan Style Inspection techniques. Fagan inspection refers to a structured process of trying to find defects in development documents such as programming code, specifications, designs and others during various phases of the software development process. The operations involved are planning, overview, preparation, inspection meeting, rework, & follow-up. – Wikipedia website
  • An informal type of Internal Review where the work product is sent to multiple reviewers and informal feedback is received without holding a feedback meeting. Example: "Here is my stuff; please let me know what you all think when you look at it!"
  • A) The second most formal of the internal review processes. Less formal than Inspections; more formal than a Desk Check, Pass Around, or Ad Hoc Review. Walkthroughs are sometimes used to educate reviewers and to gain consensus on solutions, as well as to identify issues. B) Example: "Let's schedule a meeting and walk through this code! Here is the code ahead of time."
  • A) Actual Peer Review documentation and how to fill it out (this note repeats for each of the six signoff slides that follow).
  • A) Thank you!

BlueCross: Peer Review Presentation Transcript

  • 1. Peer Review SDM - Solutions Delivery Methodology by James Didier
  • 2. Goals
    • Brief SDM introduction
    • Learn how to search for SDM documentation
    • How to do a Peer Review for Prod Fix
  • 3. SDM – Over 80 Hours Required
    • Enterprise Driven Improvement / Business Functionality Driven Efforts: ITG Project Sub-Portfolio
    • < = 80 hours use ITSM Change Management and Release Compliance
    • > 80 hours use SDM & ITSM Change Management for production move
    • Enhancements of Environments
    • < = 80 hours use ITSM Change Management and Release Compliance
    • > 80 hours use SDM & ITSM Change Management for production move
    • Business Operations
    • Use ITSM Change Management and Release Compliance
    • Critical IT Support
    • Use ITSM Change Management and Release Compliance
    • IT Production
    • Use ITSM Change Management and Release Compliance
    • ITG Admin
    • Not applicable
    • Training
    • Not applicable
    • (Reference: Solution Delivery Methodology (SDM) & IT Service Management (ITSM) Governance & Usage Standard )
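The 80-hour routing rule above can be sketched as a small decision function. This is an illustrative sketch only: the function name and the category strings are hypothetical labels taken from the slide, not part of any SDM tool.

```python
# Illustrative sketch of the >80-hour SDM vs. ITSM routing rule.
# Category strings mirror the slide; names here are hypothetical.

HOUR_DEPENDENT = {"ITG Project Sub-Portfolio", "Enhancements of Environments"}
ALWAYS_ITSM = {"Business Operations", "Critical IT Support", "IT Production"}
NOT_APPLICABLE = {"ITG Admin", "Training"}

def governance_for(category: str, effort_hours: float) -> str:
    """Return the governance track for a piece of work."""
    if category in HOUR_DEPENDENT:
        # Only these two effort types depend on the 80-hour threshold.
        if effort_hours <= 80:
            return "ITSM Change Management and Release Compliance"
        return "SDM & ITSM Change Management for production move"
    if category in ALWAYS_ITSM:
        return "ITSM Change Management and Release Compliance"
    if category in NOT_APPLICABLE:
        return "Not applicable"
    raise ValueError(f"unknown effort category: {category!r}")
```

For example, an 81-hour environment enhancement routes to SDM plus ITSM Change Management, while an 80-hour one stays under ITSM alone.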
  • 4. 3i Blue Process
    • 3i Blue Process – Project Portfolio Management Process that prioritizes key initiatives.
    • Project Management is SDM Responsible:
    • Tier 1: > $1M
    • Tier 2: $250K to $1M
    • YOU are SDM Responsible:
    • Tier 3: $30K to $250K
    • Tier 4: < $30K or < 400 hours
    • (Reference: PM0002-STD-Project Categorization )
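The tier bands above translate into a simple lookup. This is a hedged sketch: the slide only gives ranges, so the handling of exact boundaries ($250K, $30K) is an assumption, and the "< 400 hours" alternative for Tier 4 is not modeled.

```python
# Sketch of the 3i Blue tier bands from this slide. Boundary handling
# at exactly $250K / $30K is an assumption; the <400-hour Tier 4
# clause is ignored here.

def project_tier(budget_dollars: float) -> int:
    """Map a project budget to its 3i Blue tier."""
    if budget_dollars > 1_000_000:
        return 1
    if budget_dollars >= 250_000:
        return 2
    if budget_dollars >= 30_000:
        return 3
    return 4

def sdm_responsible(tier: int) -> str:
    # Tiers 1-2 get an assigned project manager; for Tiers 3-4, YOU are
    # SDM responsible.
    return "Project Manager" if tier <= 2 else "You"
```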
  • 5. (Reference: V&V0002-PRO-Testing and Review Processes)
  • 6. Development – Waterfall vs. Iterative
    • The waterfall model is a sequential software development model (a process for the creation of software) in which development is seen as flowing steadily downwards (like a waterfall) through the phases of requirements analysis, design, implementation, testing (validation), integration, and maintenance. The origin of the term "waterfall" is often cited to be an article published in 1970 by W. W. Royce. Royce himself advocated an iterative approach to software development and did not even use the term "waterfall". Royce originally described what is now known as the waterfall model as an example of a method that he argued "is risky and invites failure".
    • Iterative and Incremental development is a software development process developed in response to the weaknesses of the more traditional waterfall model. The basic idea behind iterative enhancement is to develop a software system incrementally, allowing the developer to take advantage of what was being learned during the development of earlier, incremental, deliverable versions of the system. The two most well known iterative development frameworks are the Rational Unified Process and the Dynamic Systems Development Method. Iterative and incremental development is also an essential part of Extreme Programming and all other agile software development frameworks.
    • Iterations increase with more complex requirements, which are usually broken into manageable chunks by Use Cases that identify key system features through Requirements Engineering. Change Management manages change within an ongoing project through base-lining.
    • Iterations decrease:
      • Light, Straightforward Functional Requirements -- Easy to understand what is needed, without putting in the format of scenarios with sequences of actions
      • Mainframe -- Minimal user interface
      • Highly Computational / Algorithmic / Scientific -- Minimal user interface
      • Embedded Controls -- Minimal user interface
    • (Reference: Wikipedia webpage & RE0001-STD-Requirements Engineering Guidebook )
  • 7. Project Failure
    • Failure Statistics:
    • A 2001 study performed by M. Thomas in the U.K. of 1,027 projects showed that scope management related to attempting waterfall practices, including detailed, up-front requirements, was cited by 82 percent of failed projects as the number one cause of failure.
    • This is backed up by other research – according to Jim Johnson of the Standish Group, when requirements are specified early in the lifecycle, 80% of the functionality is relatively unwanted by the users. He reports that 45% of features are never used, 19% are rarely used, and 16% are sometimes used.
    • Reasons for Failure:
    • When project stakeholders are told that they need to get all of their requirements down on paper early in the project, they desperately try to define as many potential requirements (things they might need but really aren't sure about right now) as they can.  They know if they don't do it now then it will be too hard to get them added later because of the change management/prevention process which will be put in place once the requirements document is base-lined.
    • Things change between the time the requirements are defined and when the software is actually delivered. 
    • (Reference: www.agilemodeling.com/essays/agileRequirementsBestPractices.htm )
  • 8.  
  • 9. Why do Peer Reviews?
    • (Reference: V&V0020-STD-Standard Procedures for Internal Review )
    • The goal of these reviews is to prevent defects from passing to later phases.
    • Early detection improves product quality and fosters a reduction in cost , rework, and production support.
    • Additionally, reviews enable knowledge transfer between team members; the process of reviewing another team member’s work provides an opportunity to become familiar with other technical and/or functional areas.
  • 10. 1 Mandatory Review Document for Production Fixes
    • V&V0022-TMPL-Review Summary Report Form (Peer Review - Mandatory)
    • V&V0020-STD-Standard Procedures for Internal Review (Peer Review - Instructional)
      • All SDM Review Documents
    • 1) V&V0002-PRO-Testing and Review Processes-SDM-0002
    • 2) V&V0021-TMPL-Review Issue Log-SDM-0021
    • 3) V&V0022-TMPL-Review Summary Report Form-SDM-0022
    • 4) V&V0025-STD-Standard Procedures for Stakeholder Review-SDM-0025
    • 5) V&V0030-TMPL-Standard Review Checklists-SDM-0030
    • 6) CM0004-PRO-Project Testing Defect Mgmt Process-SDM-0004
    Only 1 . . . don’t be fooled!
  • 11. Production Fix Review
    • At least one successful review is required.
    • The programmer is responsible for giving any supporting documentation to the reviewer if asked.
    • If the review fails, do not sign off; just give feedback.
    • Signoff is not a rubber stamp! The reviewer will now share the responsibility, with the programmer, for any adverse impact.
  • 12. Document Process Types
    • Type Name
    • CM Configuration Management
    • MA Measurement & Analysis
    • PI Process (SDM) Infrastructure
    • PM Project Management
    • PS Protection Services
    • QA Quality Assurance
    • RE Requirements Engineering
    • SA Solution Architecture
    • SAM Supplier Agreement Management
    • TM Testing Management
    • VV Verification & Validation
    • (Reference: PI0007-STD-Document Naming and Numbering )
  • 13. Document Categories
    • Category Description
    • POL Policy
    • PRC Procedure
    • PRO Process
    • STD Standard
    • DN Document Names as defined by the Project Communication Management Plan
    • Guide Guideline
    • Plan A document that outlines how the project will accomplish certain activities: e.g. Project plan, requirements plan, implementation plan
    • TMPL A document used as a starting point for project deliverables
    • (Reference: PI0007-STD-Document Naming and Numbering )
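The naming convention on slides 12-13 (process type + document number, category, title, optional SDM identifier) can be parsed mechanically. The pattern below is inferred from the example names in this presentation, not taken from PI0007-STD itself, so treat it as a sketch.

```python
import re

# Hedged sketch: parse SDM document names such as
# "V&V0022-TMPL-Review Summary Report Form-SDM-0022".
# Pattern inferred from examples, not from PI0007-STD itself.
NAME_RE = re.compile(
    r"^(?P<type>[A-Z&]+?)(?P<number>\d{4})"   # process type + doc number
    r"-(?P<category>[A-Z]+)"                  # category, e.g. TMPL, STD, PRO
    r"-(?P<title>.+?)"                        # human-readable title
    r"(?:-SDM-(?P<sdm_id>\d{4}))?$"           # optional SDM identifier
)

def parse_sdm_name(name: str) -> dict:
    """Split an SDM document name into its convention fields."""
    m = NAME_RE.match(name)
    if not m:
        raise ValueError(f"not an SDM document name: {name!r}")
    return m.groupdict()
```

For instance, `parse_sdm_name("PM0002-STD-Project Categorization")` yields type `PM` (Project Management) and category `STD` (Standard), with no trailing SDM identifier.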
  • 14. Peer Review Form
  • 15. Review Procedures
  • 16. PAL
  • 17. Balance - Agile Software Development vs. SDM
    • Agile Software Development
    • There are a number of agile software development methods, such as those espoused by the Agile Alliance. Most agile methods attempt to minimize risk by developing software in short timeboxes, called iterations, which typically last one to four weeks. Each iteration is like a miniature software project of its own, and includes all of the tasks necessary to release the mini-increment of new functionality: planning, requirement analysis, design, coding, testing, and documentation. While an iteration may not add enough functionality to warrant releasing the product, an agile software project intends to be capable of releasing new software at the end of every iteration. In many cases, software is released at the end of each iteration. This is particularly true when the software is web-based and can be released easily. Regardless, at the end of each iteration, the team reevaluates project priorities.
    • Agile methods emphasize real-time communication, preferably face-to-face, over written documents.
    • Agile methods also emphasize working software as the primary measure of progress. Combined with the preference for face-to-face communication, agile methods produce very little written documentation relative to other methods. This has resulted in criticism of agile methods as being undisciplined.
    • Only do documentation when it adds value to the business!
    • Solution Delivery Methodology
    • Principles, and proven practices intended to enable developers to achieve success in the software development life cycle (SDLC) while “documentation used as evidence” to ensure the adherence of best practices. – 4/12/2007 SDM Meeting
    • Result is tailoring!
    • (Reference: Wikipedia webpage )
  • 18. Tailoring
    • When planning the internal review activities, project managers and test managers should work together to understand the level of risk that the project holds with respect to the quality of the work products under review. Projects may tailor their internal reviews based upon several factors including:
    • The level of previous defects discovered
    • The experience and knowledge level of the project team
    • The impact that undiscovered defects might cause the project
    • (Reference: V&V0002-PRO-Testing and Review Processes )
  • 19. Tailoring
    • (Reference: V&V0002-PRO-Testing and Review Processes )
    Low Rigor Medium Rigor High Rigor General Tailoring Guidance Testing & Review Task Desk Check or Passaround Required Walkthrough Required Walkthrough or Review Tier 4-Minimize need for formal meeting but require completion of review documentation. Overall Test & Review Planning (slide 5, #5) Risk-dependent Review on modules and at least Ad hoc review on every module Required Review on modules and at least Desk Check on every module Required Review on modules and at least Desk Check on every module Tier 4-Minimize need for formal meeting but require completion of review documentation. Code, Unit Test Cases & Plan Review (slide 5, #30)
  • 20. Peer Review Roles
    • Author
      • Prepare the work product for review
      • Assemble the documentation
      • Gather feedback
      • Fix the work product
      • Get Review Coordinator approval, if necessary, before base-lining the work product
    • Supervisor/Team Coordinator
      • Assign individual roles, collect documentation, and ensure SDM governance
    • Review Coordinator
      • Advise on time needed for the review and is responsible for its execution
      • Review the work product before distribution, manage feedback, and report to the team coordinator
    • Recorder
      • Act like an additional reviewer and log issues
    • Reviewers
      • Examine the work product to find defects and suggest improvements
      • (Reference: ISPI Deliverable #TS0046.01 Document No. TS-032-036 )
  • 21. Review Vocabulary
    • Ad Hoc Review - The most informal type of Internal Review which has very little if any formal paper trail.
    • Desk Checks - An informal type of Internal Review which does not rely on a formal feedback meeting.
    • Peer Review - See Internal Review
    • Internal Review - Review of a work product to verify its compliance to organizational standards and requirements. Also known as Peer Reviews.
    • Inspections - The most formal of the internal review processes. Utilizes Fagan Style Inspection techniques.
    • Pass Around - An informal type of Internal Review where the work product is sent to multiple reviewers and informal feedback is received without holding a feedback meeting.
    • Stakeholder Review - Review of a work product to validate that it meets its intended need or that the future work products derived from it will meet the customers’ intended need.
    • Walkthrough - The second most formal of the internal review processes. Less formal than Inspections. More formal than a Desk Check, Pass Around or Ad Hoc Review. Walkthroughs are sometimes used to educate reviewers and to gain consensus on solutions, as well as to identify issues.
    • (Reference: V&V0002-PRO-Testing and Review Processes )
  • 22. Ad Hoc Review
    • Ad Hoc Reviews are the most informal review method used to capture work product issues. Key principles of Ad Hoc Reviews are:
    • Provides a quick way to get another perspective that often finds errors we cannot see ourselves.
    • Has little impact beyond solving the immediate problem.
    • Has almost no paper trail or metrics to ensure it was executed effectively.
    • Should be used sparingly as the only review method for a work product.
    • Better used in conjunction with other internal review methods when a potential reviewer is unable to participate fully, but their involvement is critical.
    • (Reference: V&V0020-STD-Standard Procedures for Internal Review )
  • 23. Desk Checks
    • Desk Checks are an informal review method used to capture work product issues without the formality of a Feedback Meeting. Key principles of Desk Checks are:
    • Only one person besides the Author examines the work product
    • Depends entirely on the single reviewer’s knowledge, skill, and self-discipline
    • Can be fairly formal if the reviewer employs defect checklists, specific analysis methods and standard forms to keep records for the team’s review metrics collection.
    • Lowest cost/impact review approach
    • Limited data collection besides issue data and effort spent reviewing.
    • This method is suitable if:
      • Colleagues are skilled at finding defects on their own
      • Severe time and resource restrictions exist (i.e., Tier 4 Project)
      • Product is low-risk
    • (Reference: V&V0020-STD-Standard Procedures for Internal Review )
  • 24. Inspections
    • Inspections are the most formal of the internal review processes. Key principles of inspections include the following:
    • Each reviewer must be sufficiently prepared with their list of issues.
    • Reviewers should focus solely on the identified issues during the Feedback Meeting.
    • Authors do not typically address resolutions to the issue but only ask clarifying questions for the reviewer during the feedback meeting.
      • The focus on issues and not resolution allows plenty of time for identifying issues, which is the key function of inspections.
      • Solutions to individual issues may be investigated without the entire group, again saving overall review time.
      • By focusing on issues and not resolution it does not allow the Author to become defensive.
    • (Reference: V&V0020-STD-Standard Procedures for Internal Review )
  • 25. Pass Arounds
    • Pass Arounds are an informal review method used to capture work product issues without the formality of a Feedback Meeting. Key principles of Pass Arounds are:
    • Multiple, concurrent Desk Checks
    • Mitigates the issue of a reviewer providing timely feedback by having a larger sample of reviewers.
    • Mitigates the issue of a reviewer not identifying every issue.
    • Lacks the mental stimulation that a group discussion of issues may provide.
    • Requires identifying the right reviewers who can provide the targeted feedback needed.
    • Requires clear documentation of issues by the reviewers so that limited clarification is needed from the Author.
    • Limited data collection besides issues and effort spent reviewing.
    • (Reference: V&V0020-STD-Standard Procedures for Internal Review )
  • 26. Walkthroughs – Important!
    • Walkthroughs are the second most formal of the internal review processes, after inspections. Key principles of walkthroughs are:
    • The Review Coordinator or the Author leads the review team through an overall discussion of the work product and addresses issues as each section is presented.
    • The Feedback Meeting may have an alternate use to educate reviewers and gain consensus on solutions, as well as identify issues.
    • Each reviewer must come prepared with their list of issues.
    • Authors may explore solutions to the identified issues but time constraints for the meeting should be enforced.
    • (Reference: V&V0020-STD-Standard Procedures for Internal Review )
  • 27. Peer Review - Signoff (Reference: V&V0022-TMPL-Review Summary Report Form)
    • Title: Review Summary Report Form
    • Type: TEMPLATE
    • Document No: V&V0132-TMPL-IL Membership Internal Peer Code Review Summary Report-Unique BIN_PCN-0132-v00.01.01.doc
    • Group: Information Technology Group, Illinois Membership - Code Internal Peer Review
    • Version: v. 00.01.01 (Last Updated 02/02/07)
    • Prepared By: James Didier
    • Authorized Approval By: Mike Ditka
    • Effective Date: 02/02/2007
    • Release Date: 02/02/2007
  • 28. Peer Review - Signoff (Reference: V&V0022-TMPL-Review Summary Report Form)
    • Revision History (Review Summary Report Form):
      • Date: 02/02/2007 | Version: 00.01.01 | Description: Initial SDM & final version. | Author: James Didier
    • Review Identification:
      • Project Title: Unique BIN/PCN
      • Project ID / #: P00000194
      • DM #: 9748
      • Meeting Date: 02/02/2007
  • 29. Peer Review - Signoff (Reference: V&V0022-TMPL-Review Summary Report Form)
    • Reviewers (Name – Role – Preparation Time):
      • James Didier – Author
      • James Didier – Review Coordinator – .25 hrs
      • James Didier – Recorder – .25 hrs
      • James Didier – Reviewer – .25 hrs
      • William Perry – Reviewer – .25 hrs
      • Walter Payton – Reviewer – .25 hrs
    • Work Product Description: THIS MEAT062 COBOL PROGRAM WILL LOAD THE DRG_BLUE_SCRPT TABLE IN A DAILY PROC USING OPER.GCPS.GC8000.PRESDRUG.GROUP.SECT.TEMP INPUT FOR THE UNIQUE BIN-PCN PROJECT. MEND026 IS A MAGNA MODULE THAT WILL BE DEPENDENT ON THE DRG_BLUE_SCRIPT TABLE.
  • 30. Peer Review - Signoff (Reference: V&V0022-TMPL-Review Summary Report Form)
    • Review Data (Documents: Code, measured in Lines of Code [X], not Pages):
      • Planning Phase: .25 hours
      • Preparation Phase: .25 hours
      • Feedback Phase: .75 hours
      • Modification Phase: 0 hours
      • Follow-up Phase: 0 hours
      • Total: 1.25 hours
    • Planned for Review: 02/02/2007 | Actually Reviewed: 02/02/2007
  • 31. Peer Review - Signoff (Reference: V&V0022-TMPL-Review Summary Report Form)
    • Review Disposition (marked for this review: Accepted – As Is):
      • Accepted As Is: Modifications may be required in the work product, but verification of the modification is not necessary.
      • Accepted Conditionally upon verification: Defects must be corrected, and the changes must be verified by the individual named on the Inspection Summary Report.
      • Not Accepted – Review following modification: A substantial portion of the product must be modified, or there are many changes to make. A second review is required after the author has completed modification (i.e., rework).
      • Not Accepted – Review not completed: A significant fraction of the planned material was not reviewed, or the review was terminated for some reason.
  • 32. Peer Review - Signoff (Reference: V&V0022-TMPL-Review Summary Report Form)
    • Issue Resolution: Review Coordinator: James Didier; Completion Date: 02/02/2007
    • Action Items (specific actions beyond resolution of issues): Required Action / Assigned – none recorded
    • Review Minutes (attach review issue logs separately): "I thought you did an excellent job; all the documentation was clear and easy to follow."
  • 33. Thank You