Testing and Verification of VLSI Design: Verification
1. Introduction to Verification of VLSI Design and Functional Verification
DrUshaMehta02-08-2019
Dr Usha Mehta
usha.mehta@ieee.org
usha.mehta@nirmauni.ac.in
2. Acknowledgement…..
This presentation has been summarized from various books, papers, websites, and presentations on VLSI design from all over the world. I cannot cite item by item where this large pool of hints and work comes from; however, I would like to thank all the professors and scientists who created such good work in this emerging field. Without their efforts in this emerging technology, these notes and slides could not have been finished.
8. Source of Errors
• Errors in Specification
• Unspecified Functionality
• Conflicting requirements
• Unrealized features
• No model to check against, as the specification sits at the top of the abstraction hierarchy
• Errors in Implementation
• Human error in interpreting design functionality
9. How to reduce human-introduced errors in interpretation?
• Automation
• Poka-Yoke
10. • Automation
• The obvious way to reduce human-introduced error
• Not always possible, especially when the process is not well defined and requires human ingenuity and creativity
• Poka-Yoke
• A Japanese term that means "mistake-proofing" or "inadvertent error prevention"
• A step toward full automation, but not complete automation
• Human intervention is needed only to decide on the particular sequence of steps required to obtain the desired results
• Verification nowadays remains an art.
11. Redundancy
• The most costly but highly effective approach
• Most widely used for ASICs
12. Reconvergence Model
It consists of the following steps:
1. Creating the design at a higher level of abstraction
2. Verifying the design at that level of abstraction
3. Translating the design to a lower level of abstraction
4. Verifying the consistency between steps 1 and 3
5. Repeating steps 2, 3, and 4 until tapeout
The transformation can be any process, such as:
• RTL coding from a specification
• Insertion of a scan chain
• Synthesizing RTL code into a gate-level netlist
• Translating a gate-level netlist into layout …..
17. Functional Verification Approaches:
Black Box Approach
• Cannot look inside the design
• Functional verification to be performed
without any internal implementation
knowledge
• Through available interfaces only, no
internal state access
• Examples:
• Check a multiplier by supplying random
numbers to multiply
• Check a braking system by hitting the
brakes at different speeds
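The multiplier example can be sketched as a black-box check in Python: the test only drives the inputs and observes the output, with no knowledge of how the product is computed internally. The shift-and-add model standing in for the DUT is an assumption for illustration.

```python
import random

# Stand-in for the device under test: a shift-and-add multiplier.
# From the black-box point of view, the test only uses its
# input/output interface; the internals stay invisible.
def dut_multiply(a, b, width=8):
    acc = 0
    for i in range(width):
        if (b >> i) & 1:      # add a shifted copy of a for each set bit of b
            acc += a << i
    return acc

def black_box_test(trials=1000, width=8, seed=1):
    rng = random.Random(seed)
    for _ in range(trials):
        a = rng.randrange(1 << width)
        b = rng.randrange(1 << width)
        # Reference is the specification itself: "a times b".
        assert dut_multiply(a, b, width) == a * b, (a, b)
    return trials
```

Note that the test would work unchanged for any other multiplier implementation, which is exactly the black-box property.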
18. Black Box…..
• Advantages
• Independent of implementation
• Verification can proceed in parallel with the design process
• Less effort and time consumed
• Disadvantages
• Lack of visibility and controllability
• Difficult to set up interesting states/combinations
• Difficult to locate the source of a problem
• The difficulty grows when there is a long delay between the occurrence of a problem and its symptom becoming visible
19. Functional Verification Approaches
• White Box
• Intimate knowledge of, and control over, the internals of a design
• This approach can ensure that implementation-specific features behave properly
• A pure white-box approach is typically used at the system level, where the individual modules are treated as black boxes but the system itself is treated as a white box
• Grey Box
• Black-box test cases written with full knowledge of internal details
• Mostly written to increase code coverage
21. Test Bench
• A testbench mimics the environment in which the design will reside.
• It checks whether the RTL implementation meets the design specification or not.
• This environment creates invalid and unexpected as well as valid and expected conditions to test the design.
• It performs three functions:
• Generate the stimulus for simulation
• Apply this stimulus to the module under test and collect the output response
• Compare the output response with the expected golden values
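The three functions can be sketched in Python, with a simple 8-bit adder standing in as the module under test (both models here are assumptions for illustration):

```python
import random

def dut_adder(a, b):
    return (a + b) & 0xFF          # module under test: 8-bit adder, wraps around

def golden_adder(a, b):
    return (a + b) % 256           # "golden" reference model from the spec

def run_testbench(num_vectors=100, seed=7):
    rng = random.Random(seed)
    failures = []
    for _ in range(num_vectors):
        a, b = rng.randrange(256), rng.randrange(256)   # 1. generate stimulus
        response = dut_adder(a, b)                      # 2. apply and collect response
        if response != golden_adder(a, b):              # 3. compare with golden value
            failures.append((a, b, response))
    return failures
```

An empty failure list means every applied vector matched the golden model; any mismatch is logged with its stimulus so the bug can be reproduced.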
23. Input Generation
• Repetitive Input Generator
• Using specific syntax/code
• Using counter/LFSR etc
• Directed Input Generation
• By specifically writing the input pattern
• Using a text file
• A very lengthy and time-consuming method
• Very narrow but focused coverage
• Random Input Generation
• Using specific syntax/code
• Very simple and speedy process
• Very broad but shallow coverage
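A repetitive input generator built from an LFSR can be sketched as follows (the 4-bit register and tap positions are illustrative assumptions; the polynomial x^4 + x^3 + 1 gives a maximal-length sequence):

```python
def lfsr_stream(seed=0b1001, taps=(3, 2), width=4):
    """Fibonacci LFSR: the feedback bit is the XOR of the tapped bit positions.

    With width=4 and taps (3, 2), the register cycles through all 15
    non-zero states before repeating, given any non-zero seed.
    """
    state = seed
    while True:
        yield state
        feedback = 0
        for t in taps:
            feedback ^= (state >> t) & 1
        state = ((state << 1) | feedback) & ((1 << width) - 1)
```

Drawing 15 values from the generator visits every non-zero 4-bit pattern exactly once before the sequence wraps, which is what makes LFSRs attractive as compact, repeatable stimulus generators.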
29. How to check the results…
• Use waveform viewers for debugging designs, not testbenches.
• Most testbench operations execute in zero time, where a waveform viewer is not helpful.
• Check results in the message window
• Store results in the log file
33. Limitations of Functional Verification
• Large numbers of simulation vectors are needed to provide confidence that the design meets the required specifications.
• Logic simulators must process more events for each stimulus vector because of increased design size and complexity.
• More vectors and larger design sizes cause increased memory swapping, slowing down performance.
• Once the behavioural design is verified, there are many requirements for small non-functional modifications in RTL.
• Ideally, each such modification should be followed by a round of verification, which is not practical.
34. Examples of Non-Functional
Changes in RTL of Design
• Adding clock gating circuitry for power reduction
• Restructuring critical paths
• Reorganizing logic for area reduction
• Adding test logic (scan circuitry) to a design
• Reordering a scan chain in a design
• Inserting a clock tree into a design
• Adding I/O pads to the netlist
• Performing design layout
• Performing flattening and cell sizing
35. Formal Verification Methods
• Techniques to prove or disprove the functional equivalence of two designs.
• The techniques used are static and do not require simulation vectors.
• You only need to provide a functionally correct, or "golden", design (called the reference design) and a modified version of the design (called the implementation).
• By comparing the implementation against the reference design, you can determine whether the implementation is functionally equivalent to the reference design.
• Methods:
• Equivalence Checking
• Model Checking
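For a small combinational block, equivalence checking amounts to comparing complete truth tables, with no stimulus vectors required. A minimal sketch (the two forms of the function are illustrative assumptions):

```python
from itertools import product

def reference(a, b, c):
    # Golden design: y = (a AND b) OR c
    return (a & b) | c

def implementation(a, b, c):
    # Restructured form of the same function (De Morgan):
    # y = NOT(NOT(a AND b) AND NOT(c))
    return 1 - ((1 - (a & b)) & (1 - c))

def equivalent(f, g, n_inputs=3):
    # Enumerate the full truth table: static and exhaustive for small n.
    return all(f(*bits) == g(*bits) for bits in product((0, 1), repeat=n_inputs))
```

Real equivalence checkers avoid the exponential enumeration with BDDs and SAT, but the question they answer is exactly this one.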
37. Linting
• It finds common programmer mistakes
• It allows the programmer to find mistakes quickly and efficiently, very early, instead of waiting at the end for the full program to fail
• It checks for static errors, potential errors, and coding-style guideline violations
• Static errors: errors that do not require input vectors
• E.g.:
• A bus without a driver
• A mismatch of port width between module definition and instantiation
• A dangling input of a gate
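The "bus without a driver" check can be sketched as a toy lint pass over Verilog source text. The regular expressions only handle plain `wire` declarations and continuous `assign` statements; a real linter parses the full language, so this is a sketch of the idea, not a usable tool.

```python
import re

def lint_undriven_wires(source):
    """Flag declared wires that are never driven by any assign statement."""
    declared = set(re.findall(r'\bwire\s+(?:\[[^\]]*\]\s*)?(\w+)', source))
    driven = set(re.findall(r'\bassign\s+(\w+)', source))
    return sorted(declared - driven)

example = """
module top(input a, input b, output y);
  wire t1;
  wire t2;          // declared but never driven
  assign t1 = a & b;
  assign y  = t1;
endmodule
"""
```

Running the pass over `example` flags `t2`: a static error found with no input vectors at all.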
38. Simulator
• The most common and familiar verification tool.
• Its role is limited to approximating reality.
• Simulators attempt to create an artificial universe that mimics the future real design.
• This lets the designer interact with the design before it is manufactured and correct flaws and problems earlier.
• Functional correctness and accuracy is a big issue, as errors cannot be proven not to exist.
• A simulator builds a computing model of the circuit, executes the model for a set of input signals (stimuli, patterns, or vectors), and verifies the output signals.
• Limitations of simulation:
• Timing issues with the simulator
• The simulator can never mimic the real signal, where actual electrons flow at near the speed of light
• Cannot be exhaustive for non-trivial designs
• Performance bottleneck
39. Simulators
at different abstraction level
• System level –everything electrical, mechanical,
optical etc.
• Behavioral level – algorithm or data flow graph
by HDL
• Instruction set level – for CPU
• Register Transfer level + combinational level
• Gate level – gate as a basic element
• Switch level - transistor as a switch
• Circuit level - current and voltage parameter
• Device level - fabrication parameter
• Timing simulation – timing model
• Fault simulation- checks a test vector for fault
40. RTL Level Simulators
Type: Event Driven
• Event: change in logic value at a node, at a certain
instant of time (V,T)
• Performs both timing and functional verification
– All nodes are visible
– Glitches are detected
• Most heavily used and well-suited for all types of designs
• Uses a timewheel to manage the relationship between
components
• Timewheel = list of all events not processed yet, sorted in
time (complete ordering)
• When event is generated, it is put in the appropriate
point in the timewheel to ensure causality
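The timewheel can be sketched with a priority queue: events are popped in time order, and a value change at a node only counts as an event when the value actually differs. This is a minimal kernel for illustration, not a full simulator (there is no fan-out propagation).

```python
import heapq
import itertools

class EventSim:
    """Minimal event-driven kernel with a timewheel (priority queue)."""

    def __init__(self):
        self.timewheel = []              # pending events, ordered by time
        self.seq = itertools.count()     # tie-breaker preserves causality at equal times
        self.values = {}                 # current value of each node

    def schedule(self, time, node, value):
        heapq.heappush(self.timewheel, (time, next(self.seq), node, value))

    def run(self):
        trace = []
        while self.timewheel:
            time, _, node, value = heapq.heappop(self.timewheel)
            if self.values.get(node) != value:   # only a real change is an event
                self.values[node] = value
                trace.append((time, node, value))
        return trace
```

Scheduling an event at t=5 before one at t=2 still replays them in time order, and a scheduled update that does not change the node's value generates no event.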
41. RTL Level Simulators
Type: Cycle Based
• Take advantage of the fact that most digital
designs are largely synchronous (state
elements change value on active edge of
clock)
• Compute steady-state response of the
circuit
– at each clock cycle
– at each boundary node
• Only boundary nodes are evaluated
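A cycle-based evaluation loop can be sketched as follows: the design is reduced to a next-state function and an output function, evaluated once per clock cycle at the register (boundary) nodes, with no notion of delays or internal nodes. The mod-4 counter used as the design is an illustrative assumption.

```python
def cycle_sim(next_state_fn, output_fn, state, inputs_per_cycle):
    """Compute the steady-state response once per clock cycle."""
    outputs = []
    for inputs in inputs_per_cycle:                 # one entry per clock edge
        outputs.append(output_fn(state, inputs))    # boundary-node value this cycle
        state = next_state_fn(state, inputs)        # registers update on the edge
    return outputs, state

# Example design: a mod-4 counter with an enable input.
def counter_next(state, en):
    return (state + en) % 4

def counter_out(state, en):
    return state
```

Because only the boundary values are computed, any glitching inside the combinational cloud during a cycle is invisible, which is the trade-off the comparison slide describes.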
[Figure: combinational logic between two banks of latches, with internal and boundary nodes labelled]
42. Comparison: Event Driven vs. Cycle Based
• Cycle-based is 10x-100x faster than event-driven (and uses less memory)
• Cycle-based does not detect glitches or setup/hold time violations, while event-driven does.
• Cycle-based:
– Only boundary nodes
– No delay information
• Event-driven:
– Each internal node
– Needs scheduling, and functions may be evaluated multiple times
43. Common Simulators used in
Industry…
• NC-Sim
• Verilog-XL
• VCS
• Modelsim
• More…..
44. Co-Simulators….
• VHDL-Verilog
• Analog-Digital
• Hardware-Software…
• Performance is reduced by communication and synchronization overhead.
• Translating events and values from one simulator to the other can create ambiguities
45. Waveform Viewer
• It can play back the events that occurred during the simulation, as recorded in a trace file
• Recording waveform trace data is an overhead on simulation and decreases its performance
46. Verification Metrics
• Code Coverage
• % of total code executed by given test cases
• Functional Coverage
• % of total functions executed by given test cases
47. Code Coverage Tools
• To expose bugs, you should exercise as many paths as possible
• It shows which parts of the DUT are exercised by the testbench, and hence how well the DUT is verified
• It helps find new holes
• It measures progress against the test plan
• Bugs are often sensitive to branches and conditions. For example, incorrectly writing a condition such as i<=n rather than i<n may cause a boundary-error bug.
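The i<=n versus i<n example is exactly what branch coverage catches: unless some test drives the decision both ways, the boundary case is never exercised. A toy tracker (the class, names, and instrumentation style are assumptions for illustration):

```python
class BranchCoverage:
    """Record which outcomes of each named decision the tests exercised."""

    def __init__(self):
        self.seen = {}

    def hit(self, name, outcome):
        self.seen.setdefault(name, set()).add(bool(outcome))
        return outcome

    def percent(self, names):
        covered = sum(len(self.seen.get(n, ())) for n in names)
        return 100.0 * covered / (2 * len(names))   # two outcomes per decision

cov = BranchCoverage()

def clamp(x, limit):
    # Instrumented decision: both True and False must be seen for 100%.
    if cov.hit("over_limit", x > limit):
        return limit
    return x
```

After `clamp(5, 10)` the decision has only been taken one way (50% branch coverage); adding `clamp(20, 10)` exercises the other arm and brings it to 100%.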
49. Statement / Line Coverage
• An indication of how many statements (lines) are covered in the simulation, excluding lines like module, endmodule, comments, timescale, etc.
• This is important in all kinds of design and has to be 100%
for verification closure.
• Statement coverage includes procedural statements
50. Block Coverage
• A group of statements inside a begin-end, if-else, case, wait, while loop, or for loop is called a block.
• Dead code in the design can be found by analyzing block coverage.
51. Branch / Decision Coverage
• In branch coverage or decision coverage reports, conditions like if-else, case, and the ternary operator (?:) are evaluated in both true and false cases.
52. Condition / Expression
Coverage
• This gives an indication of how well variables and expressions (with logical operators) in conditional statements are evaluated.
• Condition coverage is the ratio of the number of cases evaluated to the total number of cases present.
• If an expression has Boolean operators such as XOR, AND, or OR, expression coverage reports the input combinations actually applied to that expression against the total possibilities.
53. Toggle Coverage
• Toggle coverage reports how many times signals and ports toggled during a simulation run.
• It also measures activity in the design, exposing unused signals and signals that remain constant or rarely change value.
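Toggle counting over a recorded trace can be sketched as follows (the trace format, one dict of signal values per timestep, is an assumption for illustration):

```python
def toggle_coverage(trace):
    """Count value changes per signal across a simulation trace."""
    toggles, prev = {}, {}
    for sample in trace:                       # one {signal: value} dict per step
        for sig, val in sample.items():
            toggles.setdefault(sig, 0)
            if sig in prev and val != prev[sig]:
                toggles[sig] += 1              # a 0->1 or 1->0 transition
            prev[sig] = val
    return toggles

trace = [{"clk": 0, "en": 0},
         {"clk": 1, "en": 0},
         {"clk": 0, "en": 0}]
```

In this trace `en` shows zero toggles: exactly the kind of constant signal the metric is meant to expose.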
54. State / FSM Coverage
• FSM coverage reports whether the simulation run could reach all of the states and cover all possible transitions, or arcs, in a given state machine.
• This is a complex coverage type, as it works on the behaviour of the design: it interprets the synthesis semantics of the HDL design and monitors coverage of the FSM representation of control-logic blocks.
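State and arc coverage can be sketched by replaying stimulus through a transition table and recording what was reached. The two-state handshake FSM here is a hypothetical example:

```python
def fsm_coverage(transitions, start, stimulus):
    """Replay stimulus; return the sets of states and arcs actually hit."""
    state = start
    states_hit, arcs_hit = {start}, set()
    for sym in stimulus:
        nxt = transitions[(state, sym)]
        arcs_hit.add((state, sym, nxt))        # record the arc taken
        states_hit.add(nxt)                    # record the state reached
        state = nxt
    return states_hit, arcs_hit

# Hypothetical handshake FSM: IDLE <-> BUSY on req/done.
T = {("IDLE", "req"): "BUSY", ("IDLE", "done"): "IDLE",
     ("BUSY", "done"): "IDLE", ("BUSY", "req"): "BUSY"}
```

Replaying the stimulus ["req", "done"] reaches both states but only 2 of the 4 arcs, so this run would report 100% state coverage yet only 50% arc coverage, which is why arcs are tracked separately.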
55. Limitations of Code Coverage
• 100% code coverage is difficult to achieve
• Further, 100% Code coverage does not
prove that a design is functionally correct!
56. Functional Coverage
• Code coverage measures how much of the implementation has been exercised
• Functional coverage measures how much of the original design specification has been exercised
• The specification is the reference
• List all functions as a list of items
• Check that each item on the list is encountered
• Goal: 100% functional coverage
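The list-of-items view of functional coverage can be sketched directly (the item names are illustrative assumptions):

```python
class FunctionalCoverage:
    """Spec items form a checklist; coverage is the fraction checked off."""

    def __init__(self, items):
        self.items = {item: False for item in items}

    def cover(self, item):
        self.items[item] = True          # a test exercised this spec item

    def percent(self):
        return 100.0 * sum(self.items.values()) / len(self.items)

fc = FunctionalCoverage(["reset_behaviour", "fifo_overflow", "back_to_back_write"])
fc.cover("reset_behaviour")
fc.cover("fifo_overflow")
```

Coverage here is 2 of 3 items; the uncovered `back_to_back_write` item points directly at the next test to write, which is how the metric drives the test plan toward the 100% goal.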
58. Bug Tracking System (BTS)
• When a bug is found by a verification engineer, it is reported (logged) into the BTS
• The BTS sends a notification to the designer
• Stages:
• Open
• When the bug is filed
• Verified
• When the designer confirms that it is a bug!
• Fixed
• When it is removed from the design
• Closed
• When everything else works fine with the new fix
59. Regression and Revision
Control
• Regression
• A return to the normal state: when new features and bug fixes are made available to the team, the existing tests are re-run to confirm that nothing else broke.
• Revision Control
• When multiple users access the same data, data loss may result
• e.g. two users trying to write to the same file simultaneously
• Revision control prevents such conflicting writes
60. Hardware Modeler
• You can buy IP models for standard verification tasks
• It is cheaper to buy them than to write them yourself
• A model you write is not as reliable as one you buy
• But what if you cannot find a model to buy?
61. Verification Languages
• Hardware Description Languages (VHDL, Verilog)
• Provide concurrent mechanisms for controlling traffic streams to device input ports, and for checking outstanding transactions at the output ports
• But not suitable for building a complex verification environment
• Software Languages (C, C++)
• Suitable for building a complex environment
• But no built-in constructs for modeling hardware concepts such as concurrency, operating in simulation time, or manipulating vectors of various bit widths
62. Hardware Verification
Languages
• Why verification languages?
• They raise the abstraction level, and hence productivity
• They can automate verification
• Commercial
• e from Verisity
• OpenVera from Synopsys
• RAVE from Forte
• Public domain or open source
• SystemC from Cadence
• Jeda from Juniper Networks
64. Cost of Verification
• What if your testbench itself is buggy?
• Should the test bench be verified? How?
• The four outcomes:
• Good design passes – correct result
• Good design fails – Type I error (false negative)
• Bad design passes – Type II error (false positive)
• Bad design fails – correct result
65. How to reduce verification time and effort?
• Verification is a bottleneck in a project's time-to-profit goal, so verification is the target of new tools and methodologies.
• All of these tools and methodologies attempt to reduce verification effort and time through:
1. Parallelism of efforts
2. A higher abstraction level
3. Automation
• Some new concepts are:
1. Design for verification
2. Verification of a reusable design
3. Verification reuse (Verification IP – VIP)
66. Parallelism of Efforts
• Additional resources, applied effectively, reduce the total verification effort
• e.g. to dig a hole faster, add more workers armed with shovels
• The goal is to write and debug testbenches in parallel with each other, as well as in parallel with the design implementation.
67. Higher Level of Abstraction
• Enables engineers to work more efficiently without worrying about low-level details
• Comes with a reduction in control
• Requires additional training to understand the abstraction mechanism and how the desired effect is produced
• e.g. working at the transaction level or bus-cycle level instead of dealing with ones and zeroes
68. Automation
• A machine completes the task autonomously
• Faster, with predictable results
• It requires well-defined inputs and a standard process
• Where a variety of work exists, automation is difficult
• The variety of functions, interfaces, protocols, and transformations makes automation in verification difficult
• Tools automate various parts of the verification process, but not the complete process
• Randomization of input generation is one way to automate the verification process
69. Design for Verification
• It is reasonable to require additional design effort to simplify verification.
• The architect of the design should answer not only the question "What is this supposed to do?" but also "How is this thing going to be verified?"
• It includes:
• Well-defined interfaces
• Clear separation of functions into relatively independent units
• Providing additional software-accessible registers to control and observe internal locations
70. Verification Reuse
• Improving verification productivity is an economic necessity; verification reuse directly addresses higher productivity
• For example, a bus-functional model used to verify a design block can be reused to verify the system that uses that block
• All components should be built and packaged uniformly
• Verification reuse has its challenges: at the component level, reusing test cases or testbenches is a simpler task, but reusing a testbench component across different projects, or between two different levels of abstraction, is much harder
71. Verification of Reusable
Design
• It is proven that design reuse is problematic because "reuse is about trust"
• Only functional verification metrics can give that trust to the design reuser
• A reusable design should be verified to a greater degree of confidence than a custom design
• Reusable designs need to be verified for all possible future configurations and uses
72. Some Terminology….
• When is testing performed?
• As a separate activity – off-line testing
• Concurrent with normal system operation – on-line testing
• Where is the source of stimuli?
• Within the system itself – self-testing
• Applied by an external device/tester – external testing
• What do we test for?
• Design errors – verification
• Fabrication errors – acceptance testing
• Fabrication defects – burn-in
• Infancy physical failures – quality assurance testing
• Physical failures – field testing / maintenance testing
73. Terminology……
• How are the stimuli and expected response produced?
• Retrieved from storage – stored-pattern testing
• Generated during testing – algorithmic testing (stimuli), comparison testing (response)
• How are the stimuli applied?
• In a fixed order
• Depending upon the result obtained so far – adaptive
testing
• How fast are the stimuli applied?
• Much slower than the normal speed – DC/Static testing
• At normal operating speed – AC / At speed testing
74. Some terminology….
• What are the observed results?
• The entire output pattern
• Some function of the output pattern – compact/signature testing
• Which lines are accessible for testing?
• Only the I/O lines – edge-pin testing
• I/O and internal lines – guided-probe testing, bed-of-nails testing, electron-beam testing, in-circuit emulation, in-circuit testing (the tester automatically isolates the IC already mounted on the board)
• Who checks the results?
• The system itself – self-testing/checking
• An external device/checker – external testing