# Staffan Berg, Track F: Using Algorithmic Test Generation to Improve Functional Coverage in Existing Verification Environments


1. **Using Algorithmic Test Generation to Improve Functional Coverage in Existing Verification Environments.** Staffan Berg, Verification Specialist, Mentor Graphics (Europe)
2. **What's the Problem?** Current simulation stimuli generation techniques can't keep up with design complexity. How can we take advantage of new technology without breaking existing flows and environments?
3. **Limitations of C-R Stimuli Generation.** To produce N unique vectors, constrained-random (C-R) generation requires about N * ln(N) vectors. For many of today's complex designs, it is impossible to reach coverage goals within the project schedule.
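The N * ln(N) figure is the classic coupon-collector bound: drawing uniformly at random with replacement from N unique vectors takes roughly N * ln(N) draws before every vector has been seen once. A small Python sketch (illustrative only, not part of any tool flow) shows the effect:

```python
import math
import random

def draws_to_cover(n, seed=0):
    """Count how many random draws with replacement it takes to see all
    n values at least once -- the coupon-collector behaviour behind the
    N * ln(N) cost of constrained-random stimulus."""
    rng = random.Random(seed)
    seen = set()
    draws = 0
    while len(seen) < n:
        seen.add(rng.randrange(n))
        draws += 1
    return draws

n = 1000
avg = sum(draws_to_cover(n, seed=s) for s in range(10)) / 10
print(f"avg draws for n={n}: {avg:.0f}; n*ln(n) = {n * math.log(n):.0f}")
```

For n = 1000 the average lands in the several-thousands, in line with n*ln(n) ≈ 6908, while a non-redundant generator would need exactly 1000.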
4. **Current Test Generation Technology**
    - Directed tests: explicitly describe every test procedurally, e.g. `addr <= "0000"; wait 10; addr <= "0001";`
    - C-R tests: implicitly define the stimuli space by declaring constraints, e.g. `constraint c1 { addr inside {2, 4, [80:100]}; }`
    - Algorithmic tests: explicitly define the whole space as a grammar, through a set of rules, e.g. `Transaction = (req read | write wait_ack);`
5. **Algorithmic Test Generation Basics**

    ```
    rule_graph simple_engine {
        action init;
        action wait_rdy, setup_rd, setup_wr, ack;
        action rw_1, rw_2, rw_4;
        symbol rw_opts, rw_size;

        rw_opts = setup_rd | setup_wr;
        rw_size = rw_1 | rw_2 | rw_4;

        simple_engine = init repeat { wait_rdy rw_opts rw_size ack }
    }
    ```
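To make the semantics concrete, here is a hypothetical Python mirror of the choices in `simple_engine`: one pass through the repeat block picks one of two `rw_opts` and one of three `rw_size` alternatives, giving six distinct paths per iteration (the names come from the rule graph above; the Python modelling is my own):

```python
from itertools import product

# Alternatives from the simple_engine rule graph.
rw_opts = ["setup_rd", "setup_wr"]
rw_size = ["rw_1", "rw_2", "rw_4"]

# One iteration of the repeat block: wait_rdy -> rw_opts -> rw_size -> ack
iterations = [("wait_rdy", opt, size, "ack")
              for opt, size in product(rw_opts, rw_size)]
for path in iterations:
    print(" -> ".join(path))
```

Two choices times three choices yields exactly six unique iterations, which is the space a path-coverage goal over this graph would enumerate.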
6. **Real Example: AMBA AHB Master**
7. **Key Concepts**
    - The rules are compiled into an NDFSM (non-deterministic finite state machine) representation.
    - Action functions are written in Verilog, SystemVerilog, VHDL, or C++.
    - The rule graph is then traversed during simulation, and the action functions are called to produce stimuli.
    - Without coverage goals, the traversal is random.
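As an illustration of the last two points, the following Python sketch (a stand-in for the compiled NDFSM, with toy action functions that merely log their names instead of driving HDL signals) performs a goal-free random traversal of the `simple_engine` graph:

```python
import random

# Toy action functions standing in for the Verilog/SystemVerilog/VHDL/C++
# actions described on the slide; here each one just logs its name.
def make_action(name):
    return lambda log: log.append(name)

ACTION_NAMES = ["init", "wait_rdy", "setup_rd", "setup_wr",
                "rw_1", "rw_2", "rw_4", "ack"]
actions = {name: make_action(name) for name in ACTION_NAMES}

def traverse(iterations, rng):
    """Goal-free random traversal of the simple_engine graph: at each
    nondeterministic branch, pick an outgoing edge at random and call
    the corresponding action function to produce stimuli."""
    log = []
    actions["init"](log)
    for _ in range(iterations):
        actions["wait_rdy"](log)
        actions[rng.choice(["setup_rd", "setup_wr"])](log)
        actions[rng.choice(["rw_1", "rw_2", "rw_4"])](log)
        actions["ack"](log)
    return log

stimuli = traverse(3, random.Random(7))
print(stimuli)
```

Each traversal yields `init` followed by three four-action iterations; with coverage goals attached, the same walk would be steered instead of random.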
8. **The Role of Coverage**
    - The stimulus model describes valid inputs; the coverage model describes interesting/priority stimulus.
    - Directed tests: accurate but low productivity; difficult to produce enough tests.
    - Constrained random: automation boosts productivity, but it is difficult to target, redundancy slows coverage closure, and closing coverage requires manual analysis and constraining.
    - Algorithmic test generation: eliminates redundant stimulus, efficiently targets the coverage model, and produces stimulus outside the coverage model after coverage is achieved.
9. **Using Coverage to Drive Stimuli Generation.** Path coverage is used to define the coverage goals. A single path-coverage object can cover all legal paths in a graph, or you can use multiple PC objects to cover specific goals and cross products.
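A minimal sketch of the idea, in Python rather than the tool's own path-coverage objects: emit every path in the coverage goal exactly once, then fall back to unconstrained random selection, matching the "stimulus outside the coverage model after coverage achieved" behaviour described earlier:

```python
import random
from itertools import product

def coverage_driven(goal_paths, rng):
    """Sketch of coverage-driven traversal: yield every path in the
    coverage goal exactly once (in random order), then fall back to
    unconstrained random selection once the goal is closed."""
    remaining = list(goal_paths)
    rng.shuffle(remaining)
    while remaining:
        yield remaining.pop()            # each goal path exactly once
    while True:
        yield rng.choice(goal_paths)     # post-closure random stimulus

# Example goal: the cross of the rw_opts and rw_size choices (6 paths).
goal = list(product(["setup_rd", "setup_wr"], ["rw_1", "rw_2", "rw_4"]))
gen = coverage_driven(goal, random.Random(1))
first_six = [next(gen) for _ in range(6)]
```

The first six draws hit all six goal paths with zero redundancy, which is exactly why algorithmic generation closes coverage in N tests where random generation needs roughly N * ln(N).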
10. **ATG and OVM**
    - Fundamentals of OVM: highly modular testbench components, TLM-based communication, and a high degree of configurability and reuse.
    - What is OVM? The Open Verification Methodology, a joint development by Cadence and Mentor, with 7400+ registered users.
11. **Integration in an Existing Testbench Environment: SV OVM.** Rules drive the (partial) OVM testbench; meta-actions select values from ranges or sets.
12. **Integration in an Existing Testbench Environment: SV OVM.** Steps: declare a sequence item, create a sequence item, assign values, send the item to the sequencer, and modify the environment.
13. **Integration in an Existing Environment: 'e' Testbench.** The rule graph is compiled (sn_compile) into a C object with a C API, wrapped by an 'e' unit or struct, and called from an 'e' verify unit to fill 'e' structs:

    ```
    graph_1 : infact_e_comp is instance;

    verify_1() @posedge_clk is {
        var req : trans_struct = new;
        while (TRUE) do {
            graph_1.fill_item(req);
        };
    };
    ```
14. **'e' Integration Specifics**
    - The test engine is untimed.
    - The action functions can be completely auto-generated.
    - 'e' enumerated types are mapped to action functions, e.g. `type instruction_t : [add, sub, mult];`
    - Minimal changes to the existing 'e' environment are required.
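A hypothetical Python sketch of the enum-to-action mapping (the `instruction_t` values come from the slide; the generator itself is my own illustration of what "completely auto-generated" action functions could look like):

```python
from enum import Enum

# Python mirror of the 'e' declaration: type instruction_t : [add, sub, mult];
class InstructionT(Enum):
    ADD = "add"
    SUB = "sub"
    MULT = "mult"

def generate_actions(enum_cls, item):
    """Auto-generate one action function per enumeration value; each
    action writes its value into the transaction item being filled."""
    def make(value):
        def action():
            item["instruction"] = value.value
        return action
    return {v.name.lower(): make(v) for v in enum_cls}

req = {}
actions = generate_actions(InstructionT, req)
actions["mult"]()   # the traversal would call these as it visits nodes
```

One generated action per enum literal is what lets the integration work with minimal hand-written code.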
15. **Case Study: Wireless Infrastructure**
    - Complex interface with 1000's of configurations.
    - Complex 'e' testbench representing 100's of man-years.
    - Impossible to achieve coverage goals within reasonable time using C-R.
    - (Diagram: inFact action functions feeding the Specman TB structs, eVCs, BFM, monitor, and sequence drivers around the DUT.)
16. **Case Study: Wireless Infrastructure.** Results: ATSG achieved the coverage goals in 1/10th of the simulation time* (fewer than 100K tests vs. 850K tests). *"Using Algorithmic Test Generation in a Constrained Random Test Environment", Håkan Askdal, DAC 2009.
17. **Case Study: AXI Bus Bridge. Existing Design and Verification Environment**
    - DUT overview: a parameterizable N-channel bus bridge with an AXI bus control interface. It arbitrates requests from a proprietary interface, performs splits or aggregation as appropriate, and generates AXI bus calls.
    - Verification environment: VCS simulator, SystemVerilog testbench, constrained-random stimulus.
    - Current testbench: generation of 20 random variables, interdependencies described with constraints, cover-points on all variables, and crosses between select cover-points.
    - (Block diagram: sequence generator, proprietary-I/F VIP, AXI bus VIP, coverage, and scoreboard around the DUT with its N request channels.)
18. **Case Study: AXI Bus Bridge. Existing Verification Objectives**
    - Functional domain space: too many legal combinations to simulate, and not all are important anyway.
    - Coverage domain space: reduced to a manageable number; the number of crosses will be limited, but there are still too many combinations.
    - Verification goals: run constrained-random tests until the test conditions are achieved. Goal #1: cover each value of each variable one time. Goal #2: cover a cross of all combinations of the bytes and addr variables.

    | Variable | Functional Domain | Coverage Domain |
    |----------|-------------------|-----------------|
    | trans    | 5                 | 5               |
    | phys     | 2                 | 2               |
    | addr     | 2^36              | 255             |
    | id1      | 256               | 64              |
    | id2      | 256               | 64              |
    | bytes    | 65,536            | 776             |
    | pri      | 2                 | 2               |
    | wrap     | 2                 | 2               |
    | start    | 2                 | 2               |
    | end      | 2                 | 2               |
    | seq1     | 2^32              | 64              |
    | seq2     | 2^32              | 64              |
    | offset1  | 16                | 16              |
    | offset2  | 16                | 16              |
    | res      | 4                 | 4               |
    | cache    | 2                 | 2               |
    | type1    | 4                 | 4               |
    | ctrl1    | 8                 | 8               |
    | type2    | 4                 | 4               |
    | ctrl2    | 4                 | 4               |
19. **Case Study: AXI Bus Bridge. Existing Verification Objectives (continued)**
    - Verification Goal #1: cover each value of each variable one time. 20 cover-bins in total; the largest cover-bin contains 776 cover-points; 1360 cover-points in total (the sum of the coverage-domain sizes).
    - Verification Goal #2: cover a cross of all combinations of the bytes and addr variables, minus a few select unimportant cases. The bytes cover-bin contains 776 cover-points and the addr cover-bin 255, for a total of 196,608 cross cover-points.
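The totals quoted above can be cross-checked from the coverage-domain sizes; a quick Python sanity check (sizes copied from the table):

```python
# Coverage-domain sizes from the table, in variable order.
coverage_domain = {
    "trans": 5, "phys": 2, "addr": 255, "id1": 64, "id2": 64, "bytes": 776,
    "pri": 2, "wrap": 2, "start": 2, "end": 2, "seq1": 64, "seq2": 64,
    "offset1": 16, "offset2": 16, "res": 4, "cache": 2, "type1": 4,
    "ctrl1": 8, "type2": 4, "ctrl2": 4,
}

# Goal #1: one bin per value of each variable.
total_points = sum(coverage_domain.values())
print(total_points)  # 1360, matching the slide

# Goal #2: the full bytes x addr cross would be 776 * 255 = 197,880
# points; the slide's 196,608 excludes "a few select unimportant cases".
full_cross = coverage_domain["bytes"] * coverage_domain["addr"]
print(full_cross)  # 197880
```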
20. **Case Study: AXI Bus Bridge. Results with the Existing Testbench**
    - Verification results #1: 1360 total cover-points, largest cover-bin 776. Coverage achieved: 100%; testcases required: 475,500.
    - Verification results #2: bytes cover-bin 776, addr cover-bin 255, 196,608 total crosses. Coverage achieved: 79%; testcases required: 26,315,000.
    - (Charts: CRT coverage vs. testcases for both goals.)
21. **Case Study: AXI Bus Bridge. Updated Verification Environment**
    - Design under test: no change.
    - Simulator: replace VCS with Questa/inFact.
    - Testbench: stay with SystemVerilog; almost no change (see next slide).
    - Stimulus: add a graph at the top level.
22. **Case Study: AXI Bus Bridge. ATSG Testbench Steps**
    - Graph development: describe the domain of each variable, the constraints on variable relationships, and the variables and combinations to cover.
    - Graph integration: replace the call to the SV `randomize()` function with a call to the inFact `fill()` method.
    - Testbench development effort: less than 1 day in total; approximately 100 lines of inFact code; less than 10 lines of SystemVerilog; 99.9% reuse of existing testbench code.
23. **Case Study: AXI Bus Bridge. Coverage Closure Results**
    - Verification results #1: 1360 total cover-points, largest cover-bin 776. Coverage achieved: 100% in 776 testcases, a 612x coverage-closure acceleration.
    - Verification results #2: bytes cover-bin 776, addr cover-bin 255, 196,608 total crosses. Coverage achieved: 100% in 196,608 testcases, a coverage-closure acceleration of >>170x (CRT had reached only 79% after 26,315,000 testcases).
    - (Charts: iTBA vs. CRT coverage-vs-testcases curves for both goals.)
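A quick arithmetic check on the acceleration figures (the numbers are from the slides; the computation is mine):

```python
# Goal #1: CRT needed 475,500 testcases; the graph-based run closed the
# same 1360 cover-points in 776 testcases.
crt_tests_goal1, atg_tests_goal1 = 475_500, 776
speedup_goal1 = crt_tests_goal1 // atg_tests_goal1
print(speedup_goal1)  # 612, the "612x" on the slide

# Goal #2: CRT reached only 79% of the 196,608 crosses after 26,315,000
# testcases, so the ratio below (~134x) is only a lower bound; full CRT
# closure would take far longer, consistent with the slide's ">>170x".
lower_bound_goal2 = 26_315_000 / 196_608
print(round(lower_bound_goal2))
```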
24. **Conclusions.** ATG can significantly shorten time-to-coverage, and it can be introduced while still preserving existing verification IP. More verification, fewer cycles. Thank you!