Verification Metrics
Dave Williamson
CPU Verification and Modeling Manager
Austin Design Center
June 2006

Verification Metrics: Why do we care?
  - Predicting functional closure of a design is hard
  - Design verification is typically the critical path
  - CPU design projects rarely complete on schedule
  - The cost of failing to predict design closure is significant

Two key types of metrics
  - Verification test plan based metrics
    - Amount of direct tests completed
    - Amount of random testing completed
    - Number of assertions written
    - Amount of functional coverage written and hit
    - Verification reviews completed
  - Health of the design metrics
    - Simulation passing rates
    - Bug rate
    - Code stability
    - Design reviews completed
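
"Functional coverage written and hit" is a simple ratio over the coverage database. A minimal sketch in Python, assuming a hypothetical mapping from coverage-point name to hit count (the point names and counts below are invented for illustration):

```python
# Hypothetical coverage database: point name -> number of times hit.
hits = {
    "fetch.branch_taken": 12,
    "decode.illegal_opcode": 0,
    "lsu.unaligned_store": 3,
    "lsu.cache_line_cross": 0,
}

written = len(hits)                                # coverage points written
covered = sum(1 for n in hits.values() if n > 0)   # points hit at least once
print(f"{covered}/{written} points hit ({100.0 * covered / written:.1f}%)")
# → 2/4 points hit (50.0%)
```

Tracking this ratio per unit over time gives the closure curve shown later in the deck.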

Challenges and limitations
  - Limitations of test plan based metrics
    - They give a best-case answer for the completion date
    - The plan will grow as testing continues
  - Limitations of health of the design metrics
    - They can give false impressions if used independently of test plan metrics
    - They require good historical data from a similar project for proper interpretation
  - General concerns that apply to all metrics
    - What you measure will affect what you do
    - Gathering metrics is not free
    - Historical data can be misleading
    - Don't be a slave to the metrics: they are a great tool, but not the complete answer

Bug rate example

  [Chart: "Bug History". Total bug count (left axis, to ~1200) and weekly bug
  count as a 4-week rolling average (right axis, to ~20) plotted against week
  number (weeks 1-113), with a "knee in the curve" annotated where the weekly
  bug rate starts to fall off.]
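
The two series in the chart can be derived from raw weekly bug counts. A minimal sketch in Python, using invented weekly counts in place of the project's real data:

```python
# Hypothetical weekly bug-filing counts; the slide's actual data is not available.
weekly = [3, 8, 14, 18, 17, 12, 9, 5, 2, 1]

# Series 1 - total bug count: cumulative sum of the weekly counts.
total = []
running = 0
for count in weekly:
    running += count
    total.append(running)

# Series 2 - weekly bug count smoothed with a 4-week rolling average
# (the window shrinks at the start of the project).
rolling = []
for i in range(len(weekly)):
    window = weekly[max(0, i - 3):i + 1]
    rolling.append(sum(window) / len(window))

print(total[-1])    # → 89 (total bugs found to date)
print(rolling[-1])  # → 4.25 (bugs/week over the last four weeks)
```

The "knee in the curve" is simply the point where the rolling average turns downward while the cumulative curve flattens.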

Bug rate by unit example

  [Chart: "Bug breakdown per design unit" plotted against week number
  (weeks 1-113), y-axis 0-300.]
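
The per-unit breakdown is a count of bug reports grouped by the design unit they were filed against. A minimal sketch, assuming a hypothetical list of bug records tagged with unit names (all invented):

```python
from collections import Counter

# Hypothetical bug records: the design unit each bug was filed against.
bugs = ["lsu", "fetch", "lsu", "decode", "lsu", "fetch"]

per_unit = Counter(bugs)
for unit, count in per_unit.most_common():
    print(unit, count)
# → lsu 3
# → fetch 2
# → decode 1
```

Binning the same records by week as well reproduces the per-unit bug-rate chart, which helps spot units whose bug rate has not yet turned the corner.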

Functional Coverage closure example

  [Chart: functional coverage closure over time, annotated where new coverage
  points were added.]
