
systemverilog_workshop (Verification Planning)


Agenda:
Verification Plan
Building Testbench
Writing Tests
Integrating Code Coverage
Analyze Coverage


  1. SystemVerilog For Verification Training: Verification Planning and Management, Session 9. Sameh El-Ashry, Hardware Verification Engineer.
  2. 14 Oct 2015: Verification Planning and Management
  3. Phases of Verification: Verification Plan, Building Testbench, Writing Tests, Integrating Code Coverage, Analyze Coverage.
  4. Detailed Phases of Verification
  5. Verification Plan Concept. The test plan is a road map for how to achieve the verification goal, and it is a living document. A test plan includes an introduction, assumptions, the list of test cases, the list of features to be tested, the approach, deliverables, resources, risks and scheduling, and entry and exit criteria. The test plan helps the verification engineer understand how the verification should be done. A test plan can come in many forms: a spreadsheet, a document, or a simple text file. Sometimes the test plan resides only in the engineer's head, which is dangerous because the process then cannot be properly measured and controlled. The test plan also contains a description of the testbench architecture and of each component and its functionality.
  6. Building Testbench. In this phase the verification environment is developed. The verification components can be developed one by one, or in parallel when more than one engineer is working on them. The coverage module can be written at any time, but it is preferable to write it first, since it gives an early measure of verification progress.
  7. Writing Tests. After the testbench is built and integrated with the DUT, it is time to validate the DUT. Initially, in constrained-random verification, tests are run randomly until roughly 70% coverage is reached or coverage shows no improvement over a day of simulation. By analyzing the coverage reports, new tests are written to cover the holes; in these tests, randomization is directed toward the holes. Finally, the hard-to-reach scenarios, called corner cases, are written in directed-verification fashion. Debugging is done in parallel, and DUT fixes are made along the way.
  8. Integrating Code Coverage and Analyzing Coverage. Once you have achieved a certain level of functional coverage, integrate code coverage: code coverage tools have an option to switch it on, and after simulation the tool produces a report. Finally, analyze both the functional and code coverage reports and take the necessary steps to achieve the coverage goals, running simulation again with different seeds while continuing to collect functional coverage information.
  9. Verification Plan: this is where we start.
  10. Case Study: Ones Counter Example. The following example is a testbench for a ones counter, including the verification components it requires. The intention of showing this example is to familiarize you with the steps required to build a verification environment and to help you understand the flow of a SystemVerilog environment.
  11. Specification of the Ones Counter. The ones counter counts the number of ones arriving in a serial stream. The minimum count value is 0; the count increments by one up to 15, after which the counter rolls back to 0. A reset is provided to clear the counter to 0; the reset signal is active-low (the flip-flops respond to its negative edge). The input is a 1-bit port on which the serial stream enters; the output is a 4-bit port from which the count value can be read. Reset and clock pins are also provided.
  12. Schematic: 4-bit counter built from four D flip-flops (signals din and count[0]..count[3]).
  13. The following is the RTL code of the ones counter, with bugs:

     module dff(clk, reset, din, dout);
       input  clk, reset, din;
       output dout;
       logic  dout;

       always @(posedge clk, negedge reset)
         if (!reset) dout <= 0;
         else        dout <= din;
     endmodule

     module ones_counter(clk, reset, data, count);
       input  clk, reset, data;
       output [0:3] count;

       dff d1(clk,      reset, data,      count[0]);
       dff d2(count[0], reset, ~count[1], count[1]);
       dff d3(count[1], reset, ~count[2], count[2]);
       dff d4(count[2], reset, ~count[3], count[3]);
     endmodule
  14. Verification Test Plan Template

     Testcase Name | Description | Section # in standard | Type (positive/negative) | Status (pass/fail/hang)
  15. Test Plan for the Ones Counter

     test1.v | Count should increment from 0 to 15. | Coverage item
     test2.v | Count should roll over to 0 after 15. | Coverage transition
     test3.v | Reset should force the output count to 0 when the count value is non-zero. | Assertion coverage
  16. Verification Environment Hierarchy

     TOP
     |-- Clock generator
     |-- DUT instance
     |-- Interface
     |-- Assertion block instance (assertion coverage)
     |-- Testcase instance
     |-- Environment
         |-- Driver
         |   |-- Stimulus
         |   |-- Covergroup
         |-- Monitor
         |-- Scoreboard
  17. Diagram: SystemVerilog environment for the ones counter. The testcase sits on top of the Env, which contains the driver (with its stimulus), monitor, assertion monitor, and scoreboard; all connect to the DUT through a SystemVerilog interface carrying reset, data, and count.
  18. Coverage-Driven Constrained-Random Verification (CDRV) Architecture. Basic functionality of a CDRV environment:
     - Input side of the DUT: generating traffic streams and driving that traffic into the design (stimuli).
     - Output side of the DUT: checking the data streams, protocols, and timing.
     - Collecting both functional coverage and code coverage information.
     - Writing deterministic tests and random tests to achieve 100% coverage.
  19. Verification Components Required for CDRV
  20. Stimulus. When building a verification environment, the verification engineer often starts by modeling the device input stimulus. In Verilog, the engineer is limited in how this stimulus can be modeled by the lack of high-level data structures; typically an array or memory is created to store the stimuli. SystemVerilog provides high-level data structures and dynamic data types for modeling stimulus, and with SystemVerilog randomization the stimulus is generated automatically. Stimulus is also processed in other verification components; SystemVerilog's high-level data structures help store and process it efficiently. A sketch follows.
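     A minimal sketch of such a stimulus model for the ones counter; the class name, fields, and weights are illustrative assumptions, not from the slides:

       // Hypothetical stimulus transaction for the ones counter: one serial
       // data bit per cycle, plus an occasional active-low reset request.
       class ones_counter_txn;
         rand bit din;        // serial input bit driven into the DUT
         rand bit do_reset;   // when set, pulse the active-low reset

         // Keep resets rare so the counter gets a chance to advance.
         constraint reset_rare_c { do_reset dist {1'b0 := 9, 1'b1 := 1}; }
       endclass

       // Usage: ones_counter_txn txn = new(); void'(txn.randomize());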
  21. Stimulus Generator. The generator component generates the stimulus that the driver sends to the DUT; stimulus generation is modeled to produce stimulus based on the specification. For a simple memory, the stimulus generator generates read and write operations, the address, and, for a write operation, the data to be stored at that address. Scenarios such as alternating read/write operations are specified in a scenario generator. SystemVerilog provides constructs to control the distribution and order of random generation. In general, the generator should be able to generate every possible scenario, and the user should be able to control the generation from directed and directed-random testcases. A sketch with a dist constraint follows.
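     A hedged sketch of such a generator for the simple-memory example, using a dist constraint to control the random distribution; all names here are assumptions:

       // Hypothetical memory transaction with a weighted operation mix.
       class mem_txn;
         typedef enum {READ, WRITE} op_e;
         rand op_e      op;
         rand bit [7:0] addr;
         rand bit [7:0] data;

         // Bias generation toward writes so reads usually find valid data.
         constraint op_dist_c { op dist {WRITE := 7, READ := 3}; }
       endclass

       // Generator: randomize transactions and hand them to the driver.
       class generator;
         mailbox #(mem_txn) gen2drv;

         function new(mailbox #(mem_txn) gen2drv);
           this.gen2drv = gen2drv;
         endfunction

         task run(int n);
           repeat (n) begin
             mem_txn t = new();
             if (!t.randomize()) $error("randomize failed");
             gen2drv.put(t);
           end
         endtask
       endclass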
  22. Transactor. The transactor performs high-level operations: splitting burst operations into individual commands, handling a sub-layer protocol within a layered protocol (like the PCI Express Transaction Layer over the PCI Express Data Link Layer, or TCP/IP over Ethernet), and so on. It also handles the DUT configuration operations, and this layer provides the necessary information about the generated stimulus to the coverage model. Stimulus generated in the generator is high level, for example a packet with good CRC, length 5, and da equal to 8'h0. This high-level stimulus is converted into low-level data using packing; the low-level data is just an array of bits or bytes. Packing is an operation in which the high-level stimulus values (scalars, strings, array elements, and structs) are concatenated in the specified manner. A packing sketch follows.
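     Packing can be sketched with SystemVerilog's streaming operator; the packet fields below are illustrative assumptions echoing the slide's example:

       class packet;
         rand bit [7:0] da;         // destination address (e.g. 8'h0)
         rand bit [7:0] len;        // length (e.g. 5)
         rand byte      payload[];
         bit  [31:0]    crc;

         constraint len_c { len inside {[1:16]}; payload.size() == len; }

         // Packing: concatenate high-level fields into a flat byte array.
         function void pack(output byte bytes[]);
           bytes = {>>{da, len, payload, crc}};
         endfunction
       endclass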
  23. Driver. The driver translates the operations produced by the generator into the actual inputs of the design under verification. Generators create inputs at a high level of abstraction, namely as transactions such as read/write operations; the driver converts this input into actual design inputs, as defined in the specification of the design's interface. If the generator generates a read operation, then the read task is called, and in it the DUT input pin "read_write" is asserted. A driver sketch follows.
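     A minimal driver sketch for the memory-style transactions above; the mem_if interface and its read_write pin are assumptions echoing the slide's example:

       // Hypothetical interface carrying the memory-style pins.
       interface mem_if(input logic clk);
         logic       read_write;
         logic [7:0] addr;
         logic [7:0] data;
       endinterface

       class driver;
         virtual mem_if     vif;
         mailbox #(mem_txn) gen2drv;

         function new(virtual mem_if vif, mailbox #(mem_txn) gen2drv);
           this.vif     = vif;
           this.gen2drv = gen2drv;
         endfunction

         // Translate each transaction into pin activity on the interface.
         task run();
           forever begin
             mem_txn t;
             gen2drv.get(t);
             @(posedge vif.clk);
             vif.read_write <= (t.op == mem_txn::WRITE);  // assert for writes
             vif.addr       <= t.addr;
             vif.data       <= t.data;
           end
         endtask
       endclass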
  24. Monitor. The monitor reports protocol violations and identifies all the transactions. Monitors are of two types: passive and active. Passive monitors do not drive any signals; active monitors can drive the DUT signals. A monitor is sometimes also referred to as a receiver. The monitor converts the state of the design and its outputs to the transaction abstraction level, so they can be stored in a scoreboard database to be checked later on; in other words, it converts pin-level activity into high-level transactions. A passive-monitor sketch follows.
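     A passive monitor sketch that reverses the driver above, rebuilding transactions from pin-level activity on the same hypothetical interface:

       class monitor;
         virtual mem_if     vif;
         mailbox #(mem_txn) mon2scb;

         function new(virtual mem_if vif, mailbox #(mem_txn) mon2scb);
           this.vif     = vif;
           this.mon2scb = mon2scb;
         endfunction

         // Passive: sample the pins each cycle without driving anything.
         task run();
           forever begin
             mem_txn t = new();
             @(posedge vif.clk);
             t.op   = vif.read_write ? mem_txn::WRITE : mem_txn::READ;
             t.addr = vif.addr;
             t.data = vif.data;
             mon2scb.put(t);  // stored for later checking in the scoreboard
           end
         endtask
       endclass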
  25. Assertion-Based Monitor. Assertions are used to check time-based protocols, also known as temporal checks. Assertions are a necessary complement to transaction-based testing, as they describe the pin-level, cycle-by-cycle protocols of the design. Assertions are also used for functional coverage. A sketch follows.
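     As a sketch, the reset behavior from the test plan (test3.v) could be written as a concurrent assertion placed in the assertion block instance from the hierarchy above. Signal names follow the slide-13 RTL; checking the asynchronous reset on clock edges is a simplification:

       // Hypothetical assertion block for the ones counter.
       module ones_counter_sva(input logic clk, reset, input logic [0:3] count);
         // Active-low reset must drive the count to 0 by the next clock edge.
         a_reset_clears_count: assert property (
           @(posedge clk) !reset |=> (count == 0)
         );
         // Covering the same property also feeds assertion coverage.
         c_reset_clears_count: cover property (
           @(posedge clk) !reset |=> (count == 0)
         );
       endmodule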
  26. Data Checker. The monitor only monitors the interface protocol; it does not check whether the data matches the expected data, as the interface has nothing to do with the data content. The checker converts the low-level data to high-level data and validates it. This conversion of low-level data to high-level data is called unpacking, the reverse of the packing operation. For example, the data is collected from all the commands of a burst operation and converted into raw data, all the sub-field information is extracted from it, and the fields are compared against the expected values. The comparison status is sent to the scoreboard. An unpacking sketch follows.
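     Unpacking can be sketched as the inverse of the pack() method above; sizing the payload from the header first keeps the streaming unambiguous (still the illustrative packet fields):

       // Inside the packet class sketched earlier: inverse of pack().
       function void unpack(input byte bytes[]);
         da  = bytes[0];        // fixed-size header fields first...
         len = bytes[1];
         payload = new[len];    // ...so the payload can be sized before streaming
         {>>{da, len, payload, crc}} = bytes;
       endfunction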
  27. Scoreboard. The scoreboard is sometimes referred to as a tracker; it stores the expected DUT output. The stimulus generator generates random vectors that are sent to the DUT through the driver, and these stimuli are stored in the scoreboard until the corresponding output comes out of the DUT. For example: when a write operation stores data 202 at address 101 and, some cycles later, a read is done at address 101, what should the data be? The scoreboard recorded the address and data when the write was done, so the data stored for address 101 in the scoreboard is fetched and compared with the DUT output in the checker. The scoreboard can also contain expect logic if needed: for a 2-input AND gate, the expect logic performs the AND of the two inputs and stores the output. A sketch follows.
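     The write-then-read example in this slide maps onto an associative array keyed by address; a hedged sketch with assumed method names:

       class scoreboard;
         bit [7:0] expected[bit [7:0]];  // expected memory contents by address

         // Record what a write should leave in the memory.
         function void record_write(bit [7:0] addr, bit [7:0] data);
           expected[addr] = data;
         endfunction

         // Compare a read result from the DUT against the recorded value.
         function void check_read(bit [7:0] addr, bit [7:0] dut_data);
           if (!expected.exists(addr))
             $warning("read from address %0d before any write", addr);
           else if (expected[addr] !== dut_data)
             $error("addr %0d: expected %0d, got %0d",
                    addr, expected[addr], dut_data);
         endfunction
       endclass

       // Usage (slide's example): scb.record_write(101, 202); later,
       // scb.check_read(101, dut_read_data);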
  28. Coverage. This component holds all the functional coverage groups; a covergroup sketch for the ones counter follows. Utilities: utilities are a set of global tasks that are not tied to any protocol, so the module can be reused across projects without code modification. They include tasks such as global timeout, message-printing control, seeding control, test pass/fail conditions, and error counters. The tasks defined in the utilities are used by all other components of the testbench.
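     For the ones counter, the coverage component might hold a covergroup like this sketch; it assumes clk and count are visible here (for example, through the interface), and its bins mirror test1.v and test2.v from the test plan:

       // Hypothetical covergroup sampling the count on every clock edge.
       covergroup ones_counter_cg @(posedge clk);
         // test1.v: every count value 0..15 observed at least once.
         cp_count: coverpoint count { bins values[] = {[0:15]}; }
         // test2.v: the rollover transition from 15 back to 0.
         cp_roll : coverpoint count { bins rollover = (15 => 0); }
       endgroup

       ones_counter_cg cg = new();  // instantiate once; samples automatically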
  29. Environment. The environment contains the instances of all the verification components, and the component connectivity is also done here; the steps required to execute each component are handled in it as well. An environment sketch follows the tests description. Tests: tests contain the code that controls the testbench features. Tests can communicate with all the testbench components. Once the testbench is in place, the verification engineer focuses on writing tests to verify that the device behaves according to the specification.
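     A sketch of an environment tying together the component sketches above; the constructor signatures are the ones assumed earlier, not from the slides:

       class environment;
         generator          gen;
         driver             drv;
         monitor            mon;
         scoreboard         scb;
         mailbox #(mem_txn) gen2drv;
         mailbox #(mem_txn) mon2scb;

         // Build the components and connect them through mailboxes.
         function new(virtual mem_if vif);
           gen2drv = new();
           mon2scb = new();
           gen = new(gen2drv);
           drv = new(vif, gen2drv);
           mon = new(vif, mon2scb);
           scb = new();
         endfunction

         // Start every component; stop once the generator is done.
         task run(int n);
           fork
             gen.run(n);
             drv.run();
             mon.run();
           join_any
         endtask
       endclass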
  30. Regression Concept. Regression is re-running previously run tests and checking whether previously fixed faults have re-emerged. New bugs may appear due to new changes in the RTL or DUT, or previously hidden bugs may be unmasked by those changes; each time the design is changed, regression is run. One more important aspect of regression is testing with newly generated vectors: usually the seed for stimulus generation is the system time, so whenever a regression is run, it takes the current system time and generates vectors different from those tested earlier. In this way the testbench can reach the corners of the DUT. A seed-control sketch follows.
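     Seeding from the system time makes failures hard to reproduce, so a common refinement is to accept the seed on the simulator command line. A sketch using only standard SystemVerilog calls ($value$plusargs and process::srandom); the +seed plusarg name is an assumption:

       module seed_ctrl;
         int unsigned seed;
         initial begin
           // Fall back to a fixed default when no +seed=N is supplied.
           if (!$value$plusargs("seed=%d", seed)) seed = 1;
           $display("Running with seed %0d", seed);
           // Reseed this process's RNG; randomize() calls derive from it.
           process::self().srandom(seed);
         end
       endmodule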
  31. How much regression testing?
  32. Coverage Model: code coverage + functional coverage. (The slide shows an example functional coverage report.)
  33. (The slide shows an example code coverage report.)
  34. Code Coverage. Coverage is used to check whether the testbench has satisfactorily exercised the design; it measures the effectiveness of your verification implementation. Code coverage answers questions such as: Have all the lines of the DUT been exercised? Have all the states in the FSM been entered? Have all the paths within a block been exercised? Have all the branches of a case statement been entered? Have all the conditions in an if statement been simulated?
  35. Functional Coverage. Functional coverage answers questions such as: Have all packet lengths between 64 and 1518 been used? Did the DUT get exercised with alternating good-CRC and bad-CRC packets? Did the monitor observe that the result arrives within 4 clock cycles after a read operation? Were the FIFOs ever filled completely? Summary of functional coverage advantages: it helps determine how much of your specification was covered; it qualifies the testbench; it serves as a stopping criterion for unit-level verification; it gives feedback about untested features; it identifies redundant tests that consume valuable cycles; and it guides you to your goals earlier based on grading. A coverpoint sketch follows.
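     The packet-length and CRC questions above map directly onto coverpoint bins and a cross; a hedged sketch with assumed names:

       // Hypothetical packet covergroup, sampled explicitly per packet.
       covergroup pkt_cg with function sample(int len, bit crc_ok);
         cp_len: coverpoint len {
           bins min_len = {64};
           bins mid[4]  = {[65:1517]};  // interior range split into 4 bins
           bins max_len = {1518};
         }
         cp_crc: coverpoint crc_ok;     // both good and bad CRC must be seen
         len_x_crc: cross cp_len, cp_crc;
       endgroup

       // Usage: pkt_cg pc = new(); pc.sample(pkt_len, pkt_crc_ok);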
  36. Verification IP. How is it that there are companies working in verification services only? What are the benefits?
  37. VIP Example
  38. Tracking the Simulation Process
  39. Test-suite optimization in verification: a research task.
  40. Thank You! Presented by Sameh El-Ashry, samehelashry@ieee.org, https://eg.linkedin.com/pub/sameh-el-ashry/3b/560/22b
