Design Verification
EE295, Ishwaki Thakkar (010734451)
Abstract
Imagine a multitasking cell phone that is capturing video when a call comes in: the phone crashes and the captured video is lost. The design has a flaw, and its functionality was not properly tested. This is why a design needs verification. A set of test vectors applied to a design checks the functionality of each specification; for example, one application should not crash or misbehave when another event occurs. Verification means applying test inputs to a digital design or system and comparing how it actually behaves against how it is expected to behave. In the case above, the phone should preserve the captured video when a call interrupts. The verification engineer works through many test input vectors to check the functionality and robustness of a circuit, so the need for verification arises while the system is still being designed. This paper familiarizes the reader with the verification process: how it differs from testing, how many times verification takes place in a design, where it lies in the design flow, the types of verification methodologies, and their significance. The target audience is students taking a digital design course and anyone aiming for a basic understanding of the verification process.
Verification is not testing. Verification is the phase before silicon fabrication, while testing is the post-silicon phase. Verification assures that the product matches its specification before silicon is made; testing assures that the product matches its specification after silicon is made. There is therefore a large difference between these two similar-looking words.
Verification carries special importance: the economics of the whole company and the success of the design rely on it. Suppose a company designs a new device such as a mobile phone, spending billions on design, prototype production, ASIC vendors' services, engineering, and marketing; if, after the device is built and tested, it does not work as expected, all those billions are wasted. Verification engineers therefore build simulator models that mimic the behavior of the system; when the results from the hardware description (the RTL) match the intended system behavior, the system is very likely correct. The following paper covers the significance of design verification, where it sits in the VLSI design flow, why demand for verification engineers overtakes demand for design engineers, how verification takes place, and the various methodologies that exist, such as logic verification, functional verification, formal verification, and UVM.
To understand the basics of verifying a design, the reader should understand the intent and how verification takes place. Any electronic circuit is a closed path through which electrons travel. A digital circuit is one with discrete voltage levels 0 and 1, where 0 signifies 0 volts and 1 varies by technology (5 V, 3.3 V, 1.8 V); if the voltage is continuous rather than discrete, the circuit is analog, not digital. In any digital design, the circuit or module is designed from its specification. For example, take a half adder with inputs a and b: the sum is a XOR b and the carry is a AND b. By applying the test inputs 00, 01, 10, and 11, the user can verify the functionality of sum and carry by checking the result against the expected output. The half adder is a tiny circuit; real large-scale systems such as multi-core processors, hardware embedded in a car, or mobile processors contain millions of transistors and millions of gates. Hence, the main agenda of verification is a technique that checks the complete functionality of all the hardware under various test inputs, confirming that the circuit works as expected as a component, as a module, and as a whole system.
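As a minimal sketch of this idea, the following Verilog half adder and self-checking testbench apply all four input combinations and compare against the expected outputs (module and signal names are illustrative, not from the paper):

// Design under test: a half adder built from the specification above.
module half_adder(input a, input b, output sum, output carry);
  assign sum   = a ^ b;  // sum is a XOR b
  assign carry = a & b;  // carry is a AND b
endmodule

// Self-checking testbench: exhaustively applies all 2^2 input vectors.
module tb_half_adder;
  reg a, b;
  wire sum, carry;
  integer i;

  half_adder dut(.a(a), .b(b), .sum(sum), .carry(carry));

  initial begin
    for (i = 0; i < 4; i = i + 1) begin
      {a, b} = i[1:0];                   // apply the next test vector
      #1;                                // let combinational outputs settle
      if (sum !== (a ^ b) || carry !== (a & b))
        $error("mismatch for a=%b b=%b", a, b);
    end
    $display("half adder verified for all input combinations");
  end
endmodule

For a two-input circuit this exhaustive loop is the entire verification effort; the rest of the paper deals with designs where this brute-force approach breaks down.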
When test input patterns are applied by hand to check outputs, it is possible to verify a small circuit such as the adder, but real designs are very complex, highly integrated applications like processors. Manually checking a system with millions of gates and hundreds of inputs is infeasible: n inputs require all 2^n possible combinations, which could take millions of years to test. Thus verification requires automation, such as CAD tools that automate the design (the user gives the specifications and the tool produces the final circuit) and algorithms that decide what patterns to apply at the hardware level, helping to prove mathematically whether the design obeys its specification. In summary, automated tools and methodologies exist as the approach for how to verify a circuit.
Verification is not the testing phase but a design phase. Testers treat the system as a black box and do not need to understand its internal structure. In contrast, the verification engineer must understand the internal structure of the chip as the designers built it, model the design according to its behavior, and develop a testbench whose test cases exercise the whole functionality of the chip, covering every line of code and passing through all possible test vector combinations. Before proceeding further, the reader should know the basic VLSI or ASIC design flow in order to understand the verification process and where it lies in that flow.
ASIC Design Flow [1]
Specifications
The first step of the design phase is the specification, which describes the basic requirements of the design before implementation begins. The ASIC designer gets the specification from the customer; it may cover power, chip area, or speed. Top-level management decides the micro-architecture or a sample architecture. If the requirement is to build a full adder, the engineer decides the top-level units, for instance two half adders plus some standard cells. The area, power, and required components are estimated and analyzed against the design requirements, the analysis and cost are returned to the customer, and then the design implementation phase begins.
RTL Design
When the engineer has all the specifications, the design phase begins. A microprocessor, for example, consists of blocks such as FPUs, arithmetic units, cache memories, and built-in memories. The project lead assigns the different modules to groups of engineers, and each group builds the RTL design for its own module. This is the first technology-independent stage of the work. Once the RTL design is complete, the next phase is simulation.
Simulation
Once the RTL is designed, its functionality must be verified, which requires simulation. Simulation creates a model that exhibits the whole behavior of the system: it shows how a module affects the surrounding modules and whether any change is needed so that all units in a module, and all modules in the system, synchronize correctly. There are two styles of RTL, synthesizable and non-synthesizable, and it is advisable to write synthesizable code. In short, simulation is an abstract representation that predicts the future behavior of the system.
Synthesis
After simulation comes synthesis. Up to this phase the design was technology independent; the output of this stage is technology dependent. Synthesis is a three-stage process: translation, optimization, and mapping to a particular technology (32 nm, 45 nm, etc.). In synthesis, the tool converts the RTL implementation, usually in Verilog or VHDL, into gates, producing a .vg file. The user must also specify constraints for the design, called Synopsys Design Constraints (SDC): input delay, output delay, clock period, whether the design contains a multicycle path, whether any false path exists, latency, and clock uncertainty. The constraints vary from design to design, and the user specifies them in the .sdc file. When the .vg and .sdc files are ready, the next stage is Design for Testability.
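As an illustrative sketch of such an .sdc file (clock name, port names, and values are assumptions, not from the paper), the constraints named above map onto standard SDC commands:

# Define a 500 MHz clock on port clk (2 ns period).
create_clock -name core_clk -period 2.0 [get_ports clk]

# Budget for delays outside the block on inputs and outputs.
set_input_delay  0.4 -clock core_clk [get_ports data_in]
set_output_delay 0.5 -clock core_clk [get_ports data_out]

# Margin for clock jitter and skew.
set_clock_uncertainty 0.1 [get_clocks core_clk]

# Example timing exceptions: a false path and a 2-cycle multicycle path.
set_false_path      -from [get_ports async_rst]
set_multicycle_path 2 -from [get_pins mult_reg*/Q]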
Design for Testability
In DFT, an engineer measures the controllability and observability of the design: how much of the design can be reached, and what fault coverage percentage is achievable. A file of automatically generated test patterns is produced (ATPG, Automatic Test Pattern Generation). Thus the .vg, .sdc, and ATPG files are all in place. From simulation through DFT the user must migrate between different tools; this is not possible with a single simulator. The industry-standard synthesis tool is Design Compiler from Synopsys, while for design-for-test, Cadence tools are used. Up to this phase, the flow is front-end ASIC design. The semiconductor design process is categorized into three phases: front-end design, back-end design, and fabrication. Different designs with EDA tools require some data preparation. The following phase is the back end.
The SoC Requirement for Back end
Place and route requires various data preparation before entering the back end. The first input is the gate-level Verilog netlist (.vg) from Design Compiler. The second input is the .sdc Synopsys Design Constraints file that the synthesis tool generates. The third input is the .lib library, which specifies the technology in use: to work with 45 nm technology, the user needs the 45 nm library. The library files typically come as slow, typical, and fast libs with different PVT (process, voltage, and temperature) corners; the components in the different libraries are the same, but the PVTs differ. Along with these three inputs, the user needs one more piece of information: the library exchange format (LEF) file, which carries the metal layer information, wire information, and the width and height of each cell. Hence four inputs are needed before entering the back end: .vg, .sdc, .lib in its three corners, and the LEF.
Before proceeding to the back end, note that every step in IC design takes much time and cost, so it is necessary to check all the constraints at the initial stage to avoid mismatches. Questions come up such as whether the RTL engineer delivered a proper netlist and whether there are floating nets or multicycle-path nets. Such verification is done at the initial stage. The design should also be unique: each module instance has an instance name, and no copy of it should exist elsewhere in the design. The check-design step verifies all the constraints, reporting the approximate area, approximate power, and time required. If any information is wrong, the user informs the RTL engineer, who optimizes and resends the final design with an updated .vg gate-level netlist and .sdc. In conclusion, the design process is very delicate and must ensure precise hand-off between the front end and back end. Once the user receives the updated .vg and .sdc and they run cleanly, the next phase is time design. To understand time design, the user should know the various kinds of timing paths: time design invokes the timing engine, which classifies the design into four different timing paths.
Timing Paths [2]
A timing path is a point-to-point path in the design along which data propagates from one flip-flop to another. Each path has its own start and end point, by which it is classified into path groups, and static timing analysis (STA) reports on each group. There are four types of path: (1) input to register (data path), (2) register to register (the typical path), (3) register to output (data path), and (4) a pure combinational path; a sketch of each appears below.
[Figures (1)-(4): sketches of the four timing path types.]
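As a small illustration (hypothetical module and signal names), all four path types can appear in a few lines of Verilog:

// Hypothetical module containing each of the four timing path types.
module timing_paths(input clk, input d_in, input a,
                    output q_out, output y);
  reg r1, r2;
  always @(posedge clk) r1 <= d_in;   // (1) input-to-register data path
  always @(posedge clk) r2 <= ~r1;    // (2) register-to-register path
  assign q_out = r2;                  // (3) register-to-output data path
  assign y = ~a;                      // (4) pure combinational path
endmodule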
When all four path groups show positive slack, there is no bug or problem with the timings. Thus, in the initial phase one goes through check design and time design. Timing is verified at five different stages, each of which can add or delete components: pre-place, pre-CTS, post-CTS, post-route, and post-silicon. Whenever the design changes, timing is re-run as each new component is added so that there is no negative slack and no setup or hold violation. When the user confirms the timing of the pre-place phase, floorplanning takes place.
Floor planning
Just as before buying a house one checks the architecture and rooms to see how the beds, cars, kitchen equipment, and accessories will fit, there is a specified silicon area with hard macros like RAM, ROM, and PLL, plus soft macros. Deciding the number of analog and digital blocks, their placement, and their identification is floorplanning: the process of identifying the right place for the right component.

The designer must honor certain constraints while floorplanning. Suppose a PLL has 600 connections, of which 100 are interconnections to another module X; placing X close to the PLL is important so the interconnect delay is small and timing is met. This is a significant step to take care of. In a floorplan, analog and digital parts should not sit together. If a RAM with some IOs is placed, it should sit very near the external IOs, and interconnects should be as short as possible. Also, when an analog block connects to memory, there are predefined constraints, for example keeping the memory 5 microns away from the analog component. Hence it is necessary to fulfil all the requirements of a floorplan. After the floorplan, the power plan comes into the picture.
Power Plan
Every component has power requirements, and the supply voltage depends on the technology; for 180 nm, Vdd can be 1.8 V. If there are around 20,000 components on a chip, a single Vdd ring is enough, but if there are a million components, one Vdd or one Vss ring cannot supply every component. To overcome this, horizontal and vertical power stripes are inserted according to the design's complexity, so a component can draw power from the nearest stripe instead of from the Vdd and Vss rings. Hence every individual component gets the power it requires.
Place Design
This step is the actual placing of the design onto the chip, the next stage after the power plan. Standard cells, macros, and modules from various vendors are placed onto the chip. Placement is classified as timing-based or congestion-based: if timing is the concern, the engineer applies the timing-based mode, and if area matters most, the congestion-based mode is used. After this phase, the engineer again performs time design, called pre-CTS timing: before clock tree synthesis the timings are checked once more, and if the slack is positive, the CTS (clock tree synthesis) phase takes place.
CTS
The main aim of this phase is to obtain zero skew; to understand it, one should first know what skew is. [3]

The clock tree consists of buffers, and the clock edge reaches the launch flop through those buffers over a set of wires, so the combinational delay requirement changes once the clock network is taken into account. The edge that was supposed to arrive at 0 ns now arrives after the buffer delays along its branch:

    δ1 = Buffer1 + Buffer2 delay
    δ2 = Buffer1 + Buffer3 + Buffer4 delay

So the edge generated at t = 0 arrives at t = 0 + δ1, and the edge generated at t = T arrives at t = T + δ2; the clock cycle that previously ran from 0 to T now runs from δ1 to T + δ2. Hence the combinational delay requirement becomes

    (T + δ2) - Ts - Tsu > Tcomb + δ1

The spatial variation between temporally equivalent clock edges is called skew.

CTS optimizes the skew and tries to deliver all clock edges at the same time by inserting or reducing delay; additional clock buffers and clock inverters optimize the clock path. After adding these elements, timing is verified again, which is post-CTS timing. Once timing closes with positive slack, the next step is routing.
Routing
There are two kinds of routing: global routing and detailed routing. When two blocks in a design can be connected in multiple ways, finding the most optimized way to connect them is global routing. Detailed routing is then performed according to the details from global routing: algorithms decide the best optimized path with interconnects, and the detailed router lays down three kinds of nets: (1) signal nets, (2) power nets, and (3) clock nets. Once these nets are routed, routing is complete. Checking the timings again after routing, called post-route static timing analysis, checks for setup and hold violations. If timing passes, the reports go to the DRC engineers, who run Calibre tool checks for DRC or net violations. If a net violation exists, it is passed to the signal integrity engineer.
Signal Integrity
Some problems are solved by the SI engineer. Below 45 nm, problems such as crosstalk, electromagnetic coupling, ringing, noise, and electromigration appear, and SI checks for all of them. After the signal integrity work, static timing analysis is performed again, called post-SI timing verification. To avoid internal effects like crosstalk, shields are inserted, and the post-SI timing analysis is repeated. Once post-SI timing passes, the tapeout is generated and sent to fabrication for manufacturing, the last step in the design flow.
The Significance of Verification [2]
Verification is the process of demonstrating the functional correctness of a design: a way to measure whether the implemented design is functionally correct with respect to the design specification. Without verification there is no guarantee of the design. If the specification is incorrect or insufficient, the design will be interpreted wrongly; engineers can misinterpret or misunderstand the specification, leading to errors later on. Incorrect interaction between IPs and cores is another risk: when a design involves multiple IPs designed by different engineers, each interpreting the specification differently, complications can arise once all the IPs interact. If the result is not verified, the system can behave unexpectedly. Hence verification exists to ensure the functionality of the system is correct; without it, these hurdles prevent the design from ever meeting its functional specification.

Any bug that escapes the design process into the actual silicon can be very costly and can force a respin of the chip. Around 70% of the design cycle is spent verifying the design. As design complexity keeps increasing day by day, it is very hard to make sure the design is functionally correct to the full specification before the actual silicon is made. Hence verification is always on the critical path for any product design.
Verification classification
There are mainly three kinds of verification. The first is functional verification, a process focused on making sure the functionality satisfies the design specification, with checks such as which protocols are implemented and verifying the features the design supports. The second is timing verification, which verifies that the actual timing implementation of the design closes with positive slack and behaves at the expected frequency; if the design changes, the timing verification must be repeated, hence the pre-CTS, post-CTS, and post-SI runs seen previously. The third is performance verification, a process focused on the actual performance goals of the design: for example, if a microprocessor design expects n instructions per cycle, verifying the successful execution of those n instructions falls under performance verification, and if a memory design is supposed to read and write every cycle, the actual silicon should be able to read and write every cycle.
Methods of verification
There are various ways to verify a design: simulation-based verification, emulation/FPGA-based verification, formal verification, semi-formal verification, and HW/SW co-verification.
Verification: Planning, Metrics, Approaches [2]
Verification Plan
The first step is the verification plan, a document that captures the planning the user executes to perform verification and ensures that all essential features are verified. The document lists all the significant features to verify under different conditions, along with the details: whether features are verified independently or in combination under stress conditions. It also captures how each feature is to be verified and which methodologies to use, such as simulation-based or formal verification, which checking methods to apply, and whether coverage is needed. The plan also shows what kind of stimulus the engineer should apply, what kinds of checkers to implement, and what type of coverage to collect. Finally, the plan assigns priorities to features: as design complexity grows with the number of features, some features must be actively verified first while others can be given lower priority. Hence the verification plan is a guide that gives the engineer all the important information to perform the verification.
Verification Approaches
There are three verification approaches. The first is black-box verification, where the engineer does not need knowledge of the design implementation; the only information needed is the features the design supports, and the design is treated as a black box during verification. The advantage is that the tests are independent of the implementation; the major disadvantage is that, because the engineer does not look inside the design, there is a lack of visibility and observability. Given the complexity the engineer deals with today, a pure black-box approach, with no knowledge of the design implementation, is often impractical.

The second is the white-box approach, built on a good, deep understanding of the design implementation, which gives full visibility and observability. This lets the engineer write better test cases and stimulus to cover every feature and functionality of the design, and the good observability helps during debug. The disadvantage is that white-box tests are bound to a specific implementation, limiting the reuse of tests across projects when the implementation changes. Hence white-box testing offers increased observability and visibility compared to black box, but limits the reusability of test cases.

The third is the grey-box approach, a compromise between the black-box and white-box approaches: the engineer does not need to know the whole design in depth, only the specific features the verification effort requires. The grey box combines the two approaches, giving the white-box advantages of visibility and observability together with the black-box advantage of reusability. Hence the three approaches each have their place in different applications.
The Level of Verification
The diagram shows how the design is partitioned during the product design phase. Every design is partitioned into several systems, subsystems, core IP blocks, units, and subunits, and it is built bottom-up: most subunits and units are developed in parallel, then assembled into core blocks, the core blocks are put together to form subsystems, the subsystems form the system on chip, and the system in turn goes onto a board. As the design is partitioned, verification is partitioned across the same levels, and each level of verification suits a specific objective. The lower levels verify a unit, subunit, or core IP block in a stand-alone environment; at the unit or subunit level there is no need to build a full testbench or verification environment, and ad-hoc checks work, such as forcing certain input signals and checking that the design output behaves per the specification (a sketch of this follows below). The more verification an engineer does at the lower levels, in parallel with the design, the more bugs are found early, which saves time. Verification of these units then builds up through the subsystem and system levels to product-level verification, proceeding alongside design development from the beginning and focusing on the different levels of verification in turn.
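As a hedged sketch of such an ad-hoc unit-level check (dut, fifo_full, and stall are hypothetical names), Verilog's force/release can drive an internal node directly:

// Inside a unit-level testbench: force an internal signal, observe the response.
initial begin
  force dut.fifo_full = 1'b1;        // pretend the FIFO filled up
  #100;                              // give the design time to react
  if (dut.stall !== 1'b1)
    $error("stall was not asserted while FIFO was full");
  release dut.fifo_full;             // hand control back to the design
end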
Verification Metrics
Verification metrics help the engineer track progress toward completing the plan and judge the quality of the verification. Some of these metrics are, first, measurement metrics: the features verified, the stimulus written, how many tests have been written, whether the stimulus generator is written, pass rates, and failure rates. These help the engineer compare the actual present situation against what was planned. Second, historical-trend metrics: these grow as the number of projects increases, and metrics from past projects help the user determine how the present project is trending, comparing the quality of the current project against past ones.
Verification Methods [2]
The verification methods most commonly used are as follows.
Simulation Based Verification
The diagram shows a design implementation surrounded by a software simulator; this is the most commonly used form of design verification. A test vector generator produces input stimulus for the design, and the design under test is simulated with the simulator. A golden reference model, together with a scoreboard or checker, generates the expected ("golden") output, which is compared with the actual output from the design implementation to ensure the correctness of the design. Depending on the complexity of the design, the test vector generation can be as simple as a directed test generator or as complex as a constrained-random test generator. Assertions and coverage metrics then indicate whether verification is complete.
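A minimal sketch of this structure in SystemVerilog, reusing the half_adder module from the earlier sketch (all names are illustrative; a real bench would separate the driver, monitor, and scoreboard):

module tb_top;
  logic clk = 0, a, b;
  logic sum, carry;
  always #5 clk = ~clk;

  half_adder dut(.a(a), .b(b), .sum(sum), .carry(carry));

  // Stimulus generator: random input vectors applied off the active edge.
  initial begin
    repeat (100) begin
      @(negedge clk);
      a = $urandom_range(0, 1);
      b = $urandom_range(0, 1);
    end
    $finish;
  end

  // Golden reference model plus checker, collapsed into one step.
  always @(posedge clk) begin
    if (sum !== (a ^ b) || carry !== (a & b))
      $error("DUT output mismatch for a=%b b=%b", a, b);
  end
endmodule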
Formal Verification
Formal verification is the method by which the user proves or disproves a design implementation against a formal specification or property. It uses mathematical reasoning and algorithms to prove the design against the formal specification, exhaustively exploring all possible input values over time and exercising the design's state space. This method works well when designs are small and the numbers of inputs, outputs, and states are small; as the number of design states increases, formal verification becomes less effective.
The diagram shows how equivalence checking is done. RTL_A/Gates_A is the reference or golden design; the implementation RTL_B/Gates_B is a target design modified for a specific purpose or built with a different library. Equivalence checking of A against B determines whether they are functionally equivalent. This does not guarantee that either design is correct in an absolute sense, only that the two implementations remain functionally equivalent. In the same way, another implementation RTL_C/Gates_C can be checked for equivalence, here against design B.
Model checking is another commonly used form of formal verification: an exhaustive search of a finite state machine (FSM) for state or property violations. It is used when the design is mostly based on finite state machines. In this approach there are two inputs to the algorithm: a finite-state transition system representing the implementation, and a formal property representing the specification. The algorithm exhaustively explores all input combinations that could violate the FSM model; if it finds no input value combination that violates the model, the system is formally verified.
This diagram explains model checking. The FSM model and the formal property, the specification of the same design, are the inputs to the model checker, the formal verification tool. The tool algorithmically explores all possible input combinations over the design's state space and produces a counterexample, a combination that makes the design fail, if one exists. If the model fails for no combination, it is formally verified.
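As a small hedged sketch, a safety property of this kind can be written in SystemVerilog Assertions and handed to a formal tool (the arbiter and its gnt0/gnt1 signals are hypothetical):

// Formal property for a hypothetical two-way arbiter:
// the two grants must never be asserted in the same cycle.
module arbiter_props(input logic clk, rst_n, gnt0, gnt1);
  mutual_exclusion: assert property (
    @(posedge clk) disable iff (!rst_n) !(gnt0 && gnt1)
  ) else $error("both grants asserted together");
endmodule

A model checker bound to the arbiter either proves this property over all reachable states or returns a counterexample trace showing how both grants can fire together.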
Formal verification has many advantages. It covers the state space exhaustively, which is hard in simulation, where the user creates tests manually. There is no need to generate input stimulus, since the tools automatically generate exhaustive stimulus, and no need to generate expected output sequences such as checkers, reference models, or golden outputs. The correctness of the design is guaranteed mathematically. But it has disadvantages too.

The disadvantage of formal methods is that they do not scale to most designs: each added flip-flop doubles the state space, and the verification tools must exhaustively cover the extra combinations. Formal verification is therefore limited to designs with small state spaces, such as interfaces and small FSMs; scalability is not achievable in larger designs. Additionally, every specification must be translated into some kind of property specification language, which gets harder as the state space keeps increasing, and verifying that the properties hold under all input and state conditions is difficult.
Semi-Formal Verification
Semi-formal verification combines the best of simulation and formal verification. For a design with a large specification, where formal verification cannot be relied on alone, simulation is used first to reach the critical parts of the design, after which exhaustive formal verification is applied around those critical areas to cover their state space. This approach covers a large design state space faster and finds bugs deep in the design, thanks to the formal analysis around the critical logic, which makes it useful for bug hunting.
Assertion
An assertion is a statement about a design's intended behavior that must be verified: a way of capturing design intent, or part of the design specification, in the form of a property that can be used in dynamic simulation or formal verification to check whether that specific intent is met.
Assertion-based verification has many advantages. It improves observability and debuggability, because the design intent is captured in a property and the user knows when a specific property fired during debug. It checks correct usage and improves integration: different people build the individual units, and during integration into the top-level design, each unit's assertions help ensure the design still works properly at the integration level. It improves verification efficiency, saving much of the time otherwise spent in debug and testing. It improves communication, since assertions are a form of documentation. And assertions can be used in both static (formal) verification and dynamic simulation.
There are two types of assertion. The first is the immediate assertion, evaluated at the point in time where it executes, for example:

    assert (A == B) else $error("Wrong");

If A and B are not equal at that point in time, the design intent is violated and an error is reported. The second type is the concurrent assertion, which describes behavior spanning time and is evaluated at every clock, for example:

    property p1;
      @(posedge clk) disable iff (Reset) not (b ##1 c);
    endproperty
    assert property (p1) else $error("b ##1 c failed");

This property is checked at every positive clock edge (suspended while Reset is asserted): the sequence in which b is true and c is true one cycle later must never occur; if it does, the assertion fails.
Assertion Based Verification
Assertion-based verification is a method in which assertions are used in formal or dynamic verification. In formal verification, assertions specify the functional requirements and behavior of the design: the user writes the formal properties as assertions, and a proof procedure shows that all the specified properties hold. The user can also apply assertions in dynamic simulation, where they act as checkers and the specified properties can serve as coverage conditions, so the user can verify that the simulated conditions were met. Hence assertion-based verification serves both formal verification and dynamic simulation. In summary, simulation-based verification is the most common way to find bugs; formal verification fits selected areas such as finite-state designs, especially when the state space is small; and assertions are significant to both simulation-based and formal verification.
Directed verification vs Random Verification [5]
Verification can be performed in two ways. First, directed verification is the approach of creating a dedicated test for each feature in the design: if features must operate under multiple conditions, the user creates a targeted test for each condition. This works well when the design is at an early, immature stage of development, the design scope is limited, and it is possible to check every possible condition of the design. Second, random verification is an approach where a test is not created per feature; instead a random test generator creates multiple scenarios in a random manner. Both approaches have advantages and disadvantages: when design complexity is high and the design is mature in the later stages of development, the random approach is more useful.
Directed verification can only cover scenarios someone thought of: during the verification planning stage, whatever scenarios the designer imagines get a test, and only those features are tested. It also has a high maintenance cost. The directed approach works well when the condition space is finite and all test scenarios can be planned; when the design is huge, it is not possible to cover all scenarios in the given time. The good part is that no extensive coverage coding is needed, since each test guarantees by construction that its scenario is hit, so no effort is needed to check whether all scenarios are hit.
The pure random approach depends on a random test generator that can randomly hit all the scenarios; to make sure the whole condition space is covered, a near-infinite number of cycles must be run to reach every state condition. The trade-off is that the user has less direct control and depends more on the generator to produce the random scenarios.
To keep the generator from being purely random, there is an approach in between the two: constrained random verification, which constrains the randomness toward the scenarios critical to the design. This approach has a high ramp-up time, since the user must understand the design scenarios and code intelligent random generators, and coverage metrics must be measured to judge the constrained randomness and its completeness. The good part is that the approach gives deep control and can hit all the different state spaces in the design. It is the best balance between engineer time and computing time: the user does not have to run too many tests or depend on too much randomness, because the constraints make the scenarios hit faster. The randomness can be static or dynamic, meaning either the random tests are built up front or intelligence is built into a generator that creates randomness dynamically to hit more scenarios. Constrained random is considered best for complex designs, where it is not possible to create every test scenario by hand and the user also cannot rely on pure randomness in the generator to reach everything by luck.
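A small hedged sketch of a constrained-random stimulus class in SystemVerilog (the packet fields and constraint values are assumptions chosen for illustration):

// Constrained-random stimulus: random fields, biased toward critical cases.
class packet;
  rand bit [7:0]  addr;
  rand bit [31:0] data;
  rand bit        is_write;

  // Constrain the randomness toward interesting scenarios:
  constraint addr_c  { addr inside {[8'h00:8'h0F], 8'hFF}; } // low range plus boundary
  constraint write_c { is_write dist {1 := 7, 0 := 3}; }     // bias toward writes
endclass

// Usage inside a testbench, with drive() a hypothetical task that
// applies the packet to the DUT:
//   packet p = new();
//   repeat (1000) begin
//     if (!p.randomize()) $error("randomization failed");
//     drive(p);
//   end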
The first diagram shows the percentage of functional coverage hit versus the time taken. With the directed approach, every test case hits certain functional coverage, so coverage climbs in steps toward 100%. The constrained-random approach requires a lot of up-front effort to build the test generator, during which the user may not really be testing anything, so it has an initial ramp-up; but once the engineer generates the initial tests, coverage jumps quickly. There are then plateaus where certain scenarios are still not reached, followed by another quick jump once they are covered. Overall, this approach can reach 100% coverage faster than the directed approach.
The second graph compares engineer time versus compute-cycle time for the three approaches. With directed tests, the user initially builds a test for every scenario and over time builds bigger tests, spending a lot of both engineer time and compute cycles. With completely random tests, a lot of engineer time goes into coding coverage and building randomness, which then runs through billions of cycles. The constrained-random approach balances both: the engineer initially spends significant time devising constraints and coverage, and then uses compute cycles effectively to get better coverage. Hence the constrained-random approach has relatively low engineering cost as well as compute cost.
Coverage
Coverage is the metric of the completeness of verification. Coverage is needed most for complex designs: the user verifies based on samples, not exhaustive exploration of the complete space. With n inputs it is practically impossible to run all 2^n possible tests to cover the full space, which could take millions of years of verification. Hence the user usually follows the constrained-random approach, creating tests around critical scenarios, and needs to know exactly which areas of the design still need verification. There are two types of coverage that help identify how well each area of the design is verified.
Code Coverage
Code coverage gives a metric of how well the code is exercised during simulation. There are several kinds. Statement coverage shows how much of the source code the tests execute. Branch coverage shows whether every control structure (if, case, for, forever, and while constructs) evaluates to both true and false, i.e., whether the tests cover each branching condition. Condition coverage reports whether the tests drive every Boolean sub-expression to both true and false. Expression coverage reports on the right-hand side of assignments: for x <= (not a) xor b, it reports whether all input combinations of the xor and not are covered. Toggle coverage reports logic-node transitions: standard toggle coverage reports whether every node transitions from 0 to 1 and from 1 to 0, while extended toggle coverage also reports transitions to and from Z (z<->1 and z<->0). FSM coverage reports on the different state transitions and arc coverage.
Functional Coverage
Functional coverage differs from code coverage: it covers the functionality of the design under test as derived from the design specification, and unlike code coverage the tools cannot generate it automatically. The user must create functional coverage monitors, which the tool uses to extract coverage during simulation. Functional coverage checks whether all possible DUT input operations are injected, whether all possible DUT outputs and functional responses are seen on every output port, and whether internal DUT events of interest are exercised, such as FIFOs going full or the arbitration mechanisms being covered.
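A hedged sketch of such a monitor as a SystemVerilog covergroup, declared inside the testbench module (the signal names and bin choices are assumptions):

// Functional coverage monitor: sampled on every clock edge.
covergroup pkt_cg @(posedge clk);
  cp_addr : coverpoint addr {
    bins low      = {[8'h00:8'h0F]};
    bins high     = {[8'hF0:8'hFE]};
    bins boundary = {8'hFF};
  }
  cp_op   : coverpoint is_write;        // were both reads and writes seen?
  cp_full : coverpoint fifo_full;       // did the FIFO ever go full?
  addr_x_op : cross cp_addr, cp_op;     // every address range with each operation
endgroup

pkt_cg cg = new();  // instantiate; the coverage report shows which bins were hit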
The diagram shows coverage-driven verification. From the verification plan the user creates a set of coverage metrics describing which scenarios must be covered, then generates random tests, runs them, and collects reports against the coverage goals. If the goals are met, verification is complete; if not, the user identifies which constraints missed coverage and generates new tests or enhances the test generators to hit those scenarios. The coverage metrics themselves can also be enhanced. This process is repeated until verification is complete.
Latest Trends in Verification [6]
Hardware Accelerated Simulation/ Emulation
This is an alternative method used especially at the higher levels of verification, such as system or full-chip verification, because simulation-based verification slows down as design size increases: for a full chip or SoC, where the design is huge, simulation runs very slowly and an alternative is needed. A simulator runs on a host CPU and evaluates the signals at every cycle or every event change. At the higher levels, the time-consuming part is moved into hardware instead. As the diagram shows, the setup comprises the modules representing the design implementation and a testbench containing the behavioral components, such as stimulus generators and checkers. The most time-consuming part of a simulation is evaluating the design implementation's modules at every event change, signal change, or cycle. In this model, the most time-consuming modules are synthesized and compiled onto real hardware (an FPGA), which runs much faster on real components, while the testbench components continue to run on the host CPU; a communication mechanism is then devised between the testbench components in the simulator and the design running in the actual hardware. This approach gains a speedup of at least 10x to 20x.
Challenges
This method has several challenges. It improves speed by moving hardware components into a real system like an FPGA, but it adds HW-SW communication overhead: if the hardware and software communicate at every event change, the model may not deliver the speed benefit. Abstracting the HW-SW communication to the transaction level, rather than every clock cycle, means communication no longer happens every cycle but at transaction boundaries, which restores the speedup. These are challenges that have to be overcome.
To go further, hardware emulation is used. Hardware emulation is a full mapping of the design onto hardware, which can be an array of FPGAs acting like the real target system, giving speedups up to 1000x. There are several approaches to full hardware emulation, such as synthesizing the testbench onto the emulator, if it is fully synthesizable, or setting up a configuration that looks like the real hardware system and using a real stimulus generator or real silicon testing. In either case, debug remains a challenge, and full visibility into the internals of the design may not be available. Usually, HW/SW co-verification is used alongside these techniques.
HW/SW Co-verification
SoC design involves a hardware development phase and a software development phase. At a high level, the user starts from a system specification of what the system has to do, which during system design is split between what the hardware has to do and what the software has to do. On one side, hardware development implements the actual design from the specification through RTL to gates; on the other, software development builds from the specification down through the different levels, such as the OS and driver software, into user code. The two have to work together on the real system. In both phases there is separate verification: hardware verification covers functional and gate-level verification to ensure the hardware works correctly, while in parallel software verification checks that the software works stand-alone. Traditionally, nobody checked that the hardware and software work correctly together before going to actual silicon; as design complexity increases day by day, it is becoming essential to ensure they do, in order to avoid silicon bugs. Hence HW/SW co-verification is becoming more and more important: it helps a project complete in a shorter time and gives higher confidence in the quality of both the hardware and the software.
Conclusion
This paper covered what verification is and why it is done in an SoC or chip design flow. Verification is the process by which the user measures the functional correctness of a design; if functional correctness is not ensured, bugs can reach silicon, forcing respins that cost millions of dollars. Verification is becoming the most significant part of the design flow, appearing throughout it (pre-CTS, post-CTS, post-SI) whenever the design changes, and running in parallel with the design at the unit, subunit, and system levels. Different methods exist, such as simulation-based, formal, and assertion-based verification. Directed and random verification were compared, along with constrained random verification, which combines the advantages of both approaches. The reader also saw what coverage is and why it matters in random and constrained-random testing, as well as the latest trends: hardware-accelerated simulation and emulation, which check how the system behaves in a real environment, and HW/SW co-verification, significant for the proper synchronization and joint operation of hardware and software. Hopefully this gave the reader the basic verification concepts.
References
[1] ASIC design flow (YouTube video). <https://www.youtube.com/watch?v=Y2PQzc9Gqsw>.
[2] "SOC Verification Using SystemVerilog." Udemy. Web. 10 May 2016. <https://www.udemy.com/soc-verification-systemverilog>.
[3] I. Thakkar, "How to Time Logic," EE-287 course paper.

System models of sdlc- v model
 
Embedded Systems Q and A M.Sc.(IT) PART II SEM III
Embedded Systems Q and A M.Sc.(IT) PART II SEM IIIEmbedded Systems Q and A M.Sc.(IT) PART II SEM III
Embedded Systems Q and A M.Sc.(IT) PART II SEM III
 
V model Over View (Software Engineering)
V model Over View (Software Engineering) V model Over View (Software Engineering)
V model Over View (Software Engineering)
 
V model Over view (Software Engineering)
V model Over view (Software Engineering)V model Over view (Software Engineering)
V model Over view (Software Engineering)
 
Types of software life cycle model
Types of software life cycle model Types of software life cycle model
Types of software life cycle model
 
21UCAE65 Software Testing.pdf(MTNC)(BCA)
21UCAE65 Software Testing.pdf(MTNC)(BCA)21UCAE65 Software Testing.pdf(MTNC)(BCA)
21UCAE65 Software Testing.pdf(MTNC)(BCA)
 
SDLC Model
SDLC  ModelSDLC  Model
SDLC Model
 
A SURVEY OF VIRTUAL PROTOTYPING TECHNIQUES FOR SYSTEM DEVELOPMENT AND VALIDATION
A SURVEY OF VIRTUAL PROTOTYPING TECHNIQUES FOR SYSTEM DEVELOPMENT AND VALIDATIONA SURVEY OF VIRTUAL PROTOTYPING TECHNIQUES FOR SYSTEM DEVELOPMENT AND VALIDATION
A SURVEY OF VIRTUAL PROTOTYPING TECHNIQUES FOR SYSTEM DEVELOPMENT AND VALIDATION
 
System analsis and design
System analsis and designSystem analsis and design
System analsis and design
 
UVM BASED REUSABLE VERIFICATION IP FOR WISHBONE COMPLIANT SPI MASTER CORE
UVM BASED REUSABLE VERIFICATION IP FOR WISHBONE COMPLIANT SPI MASTER COREUVM BASED REUSABLE VERIFICATION IP FOR WISHBONE COMPLIANT SPI MASTER CORE
UVM BASED REUSABLE VERIFICATION IP FOR WISHBONE COMPLIANT SPI MASTER CORE
 
UVM BASED REUSABLE VERIFICATION IP FOR WISHBONE COMPLIANT SPI MASTER CORE
UVM BASED REUSABLE VERIFICATION IP FOR WISHBONE COMPLIANT SPI MASTER COREUVM BASED REUSABLE VERIFICATION IP FOR WISHBONE COMPLIANT SPI MASTER CORE
UVM BASED REUSABLE VERIFICATION IP FOR WISHBONE COMPLIANT SPI MASTER CORE
 
UVM BASED REUSABLE VERIFICATION IP FOR WISHBONE COMPLIANT SPI MASTER CORE
UVM BASED REUSABLE VERIFICATION IP FOR WISHBONE COMPLIANT SPI MASTER COREUVM BASED REUSABLE VERIFICATION IP FOR WISHBONE COMPLIANT SPI MASTER CORE
UVM BASED REUSABLE VERIFICATION IP FOR WISHBONE COMPLIANT SPI MASTER CORE
 
List and describe various features of electronic systems.List and .pdf
List and describe various features of electronic systems.List and .pdfList and describe various features of electronic systems.List and .pdf
List and describe various features of electronic systems.List and .pdf
 
Software Engineering Important Short Question for Exams
Software Engineering Important Short Question for ExamsSoftware Engineering Important Short Question for Exams
Software Engineering Important Short Question for Exams
 
iLumTech - Electronic design services
iLumTech - Electronic design servicesiLumTech - Electronic design services
iLumTech - Electronic design services
 
Softwareenggineering lab manual
Softwareenggineering lab manualSoftwareenggineering lab manual
Softwareenggineering lab manual
 
Flow of PCB Designing in the manufacturing process
Flow of PCB Designing in the manufacturing processFlow of PCB Designing in the manufacturing process
Flow of PCB Designing in the manufacturing process
 

Design Verification

For example, consider a two-input half adder used to check functionality: the sum is a XOR b and the carry is a AND b. By applying the test inputs 00, 01, 10, and 11, the user can verify the functionality of the sum and the carry by checking whether each result equals the expected output.
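As a minimal illustration (a sketch added here, not part of the original specification), such a half adder can be described in Verilog:

module half_adder (input a, input b, output sum, output carry);
  assign sum   = a ^ b;  // sum is a XOR b
  assign carry = a & b;  // carry is a AND b
endmodule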
The half adder is far too small a circuit to be representative; real large-scale systems such as multi-core processors, the hardware embedded in a car, and mobile processors integrate millions of transistors and millions of gates. Hence, the main agenda of verification is the technique of checking the complete functionality of all the hardware by applying various test inputs and observing whether the circuit works as expected as a component, as a module, and as a whole system. When test input patterns are applied manually to check the output, it is possible to verify a small circuit such as the adder, but real designs are very complex, highly integrated applications like processors. Manually checking systems consisting of millions of gates and hundreds of inputs is infeasible, because n inputs require all 2^n possible combinations and can take millions of years to test. Thus, verification requires automation: CAD tools that automate the design (the user gives the specifications and the tool produces the final circuit) and algorithms that decide what patterns to apply at the hardware level, helping to prove mathematically whether the design obeys the design specification as desired. In summary, automated tools and methodologies exist as the approach to verifying a circuit.

Verification is not a testing phase but a design phase, because testers treat the system like a black box and do not need to understand its internal structure to perform testing. In contrast, the verification engineer needs to understand the entire internal structure of a chip the way the designers designed it, model the design according to its behavior, and develop a test bench covering test cases such that the whole functionality of the chip, along with each and every line of code, is checked and passes through all possible test vector combinations. In order to proceed further, the user should know the basic VLSI or ASIC design flow to understand the verification process and where it lies in the design flow.
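For a design this small the automation is trivial; a minimal self-checking testbench (a sketch assuming the half_adder module above) can sweep all 2^n input combinations automatically:

module tb_half_adder;
  reg a, b;
  wire sum, carry;
  integer i;
  half_adder dut (.a(a), .b(b), .sum(sum), .carry(carry));
  initial begin
    // Apply all 2^2 = 4 input combinations and compare against expected values.
    for (i = 0; i < 4; i = i + 1) begin
      {a, b} = i[1:0];
      #1;  // allow the outputs to settle
      if (sum !== (a ^ b) || carry !== (a & b))
        $error("Mismatch for a=%b b=%b", a, b);
    end
    $display("All input combinations checked.");
    $finish;
  end
endmodule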
ASIC Design Flow [1]

Specifications
The first step of the design phase is the specification, which gives information about the basic requirements of the design, after which implementation begins. The ASIC designer gets the specification from the customer; it may cover power, chip area, or speed. Top-level management decides the micro-architecture or sample architecture: if the requirement is to build a full adder, the engineer decides the top-level units, for example two half adders and some standard cells. An estimation of the area, power, and components required is made from the design requirements, the analysis is returned to the customer along with the cost, and then the design implementation phase begins.

RTL Design
When the engineer has all the specifications, the design phase begins. Consider a microprocessor, which consists of various blocks such as FPUs, arithmetic units, cache memories, and built-in memories. The project lead assigns different modules to lower-level engineers, and the work is divided among groups, each building its own RTL-level design. This is the first stage of technology-independent work. Once the RTL design is complete, the succeeding phase is simulation.

Simulation
Once the RTL is designed, it is necessary to verify its functionality, which requires simulation. Simulation creates a model that shows the whole behavior of the system: it informs how a module will affect the surrounding modules, revealing whether any change is needed for the correct synchronization of all units within a module, or all modules within a system. There are two styles of RTL, synthesizable and non-synthesizable, and it is advisable to write synthesizable code (a short contrasting sketch follows the Design for Testability subsection below). In short, simulation is an abstract representation that predicts the future behavior of a system.

Synthesis
After simulation, test vectors verify the functionality in the synthesis process. Up to this phase the design was technology independent; the output of this stage is a technology-dependent design. Synthesis is a three-stage process: translation, optimization, and mapping to a particular technology (32 nm, 45 nm, etc.). In synthesis, the tool converts the RTL implementation, usually written in Verilog or VHDL, into gates, which forms the .vg file. Apart from that, the user needs to specify constraints for the design, called Synopsys Design Constraints: input delay, output delay, clock period, whether the design contains a multicycle path or a false path, latency, and clock uncertainty. The constraints vary from design to design, and the user specifies them in the .sdc file. When the .vg file and the .sdc file are ready, the next stage is Design for Testability.

Design for Testability
In DFT, an engineer measures the controllability and observability of a design: how far the design is achievable in terms of error rate, and what percentage of the design is coverable. A .atpg file is formed, where ATPG stands for Automatic Test Pattern Generation. Thus the .vg, .sdc, and .atpg files are produced. From simulation to DFT the user needs to migrate between different tools; this is not possible within one simulator. The industry-standard tool for synthesis is Design Compiler from Synopsys.
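As promised in the Simulation subsection above, here is a minimal contrast between synthesizable and non-synthesizable RTL (an illustrative sketch; the module names are invented):

// Synthesizable: maps directly onto a flip-flop.
module dff (input clk, input d, output reg q);
  always @(posedge clk)
    q <= d;
endmodule

// Non-synthesizable: '#' delays exist only in simulation.
module dff_delayed (input clk, input d, output reg q);
  always @(posedge clk)
    #2 q = d;  // a synthesis tool cannot implement this 2-time-unit delay
endmodule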
For Design for Test, however, Cadence tools are used. Up to this phase, the flow is front-end ASIC design. The semiconductor design process is categorized into three phases: front-end design, back-end design, and fabrication. Different designs with EDA tools require some data preparation. The following phase is the back end; the SoC requirement for the back end is place and route, and before entering the back end the user requires various pieces of data. The first input is the gate-level Verilog netlist (.vg) from Design Compiler. The second input is the .sdc Synopsys Design Constraints file, which the synthesis tool generates. The third input is the .lib library, which specifies the technology the user is working in: to work with 45 nm technology, the user needs the 45 nm library. The library files are usually a slow lib, a typical lib, and a fast lib, having different PVT (process, voltage, and temperature) corners; across the libraries the components are the same, but the PVT corners differ. Along with these three inputs, the user requires one more piece of information, the Library Exchange Format (LEF) file, which carries the metal layer information, wire information, and cell geometry such as width and height. Hence, four pieces of data are needed before entering the back-end phase: the .vg, the .sdc, the .lib in its three corners, and the LEF.

Before proceeding to the back end, every check matters, since each step of IC design consumes a great deal of time and cost; it is necessary to check all the constraints at the initial stage to avoid mismatches. Questions come up such as whether the RTL engineer delivered a proper netlist and whether there are floating nets or multicycle-path nets. Such verification is done at the initial stage, and the design should be unique: each module has an instance name that must be unique, with no copy of it anywhere else in the design. The check-design step verifies all the constraints, giving information about the approximate area, approximate power, and time required. When any information is wrong or shows a problem, the user should inform the RTL engineer, who will optimize and resend the final design with an updated .vg gate-level netlist as well as an updated .sdc. In conclusion, the design process is delicate and must ensure precise hand-off between the front end and the back end. Once the user receives the new, updated .vg and .sdc and they run cleanly, the trailing phase is time design. To understand time design, the user should know about the various kinds of timing paths: time design invokes the timing engine, which classifies the design into four different timing paths.

Timing Paths [2]
Timing paths are the point-to-point paths in a design along which data propagates from one flip-flop to another. Each path has its own start and end points, by which it is classified into path groups, on the basis of which STA creates a report. There are four types of path: 1) input to register (data path), 2) register to register (the typical path), 3) register to output (data path), and 4) a pure combinational path. (Figures in the original depict each of the four path types.)
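A single small module can contain all four path types (an illustrative sketch; the signal names are invented):

module path_types (
  input  wire clk, a, b,
  output wire y_comb,  // driven purely by combinational logic
  output wire y_reg    // driven from a register through logic
);
  reg r1, r2;
  always @(posedge clk) begin
    r1 <= a;    // 1) input-to-register path: a -> r1
    r2 <= ~r1;  // 2) register-to-register path: r1 -> r2
  end
  assign y_comb = a ^ b;  // 4) pure combinational path: a, b -> y_comb
  assign y_reg  = ~r2;    // 3) register-to-output path: r2 -> y_reg
endmodule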
When all four path groups show positive slack, there are no bugs or problems with the timing. Thus, in the initial phase one has to go through check design and time design. Timing is verified at five different stages, each of which can lead to the addition or deletion of components: pre-place, pre-CTS, post-CTS, post-route, and post-silicon. Whenever there is any change in the design, timing is re-run as each new component is added, so that there is no negative slack and no setup or hold violation. When the user confirms the timing, the pre-place phase of floor planning takes place.

Floor planning
Just as before buying a house one checks the architecture and the rooms to see how the beds, cars, kitchen equipment, and accessories will fit, a design has some specified silicon with hard macros such as RAM, ROM, and PLLs, together with soft macros. Floor planning is the process of identifying the number of analog and digital blocks, their placement, and their identification; in other words, identifying the right place for the right component. The designer should keep certain constraints in mind while floor planning. Suppose a PLL has 600 connections, of which 100 are interconnections to another module X; placing X close to the PLL is important so that the interconnect delay is small and timing is met. This is a significant point to take care of. In a floorplan, analog and digital parts should not be placed together. If a RAM with some IOs is placed, then placing the RAM very near the external IOs is important, and the interconnects should be as short as possible. Also, when an analog block connects to a memory, there are predefined constraints, such as keeping the memory 5 microns away from the analog component. Hence, it is necessary to fulfill all the requirements of the floorplan. After the floorplan, the power-plan phase comes into the picture.
Power Plan
Suppose there are 25k components; every component has power requirements, which depend on the technology (for 180 nm, Vdd can be 1.8 V). If there are 20,000 components in a chip, a single Vdd ring is enough, but with 1 million components one Vdd or one Vss ring is not enough to drive every component. To overcome this problem, horizontal and vertical power stripes are inserted according to the complexity of the design, so that a component can draw its power from a nearby stripe instead of from the Vdd and Vss rings. Hence, every individual component gets the power it requires.

Place Design
The next stage after the power plan is the real placing of the design onto the chip. Standard cells, macros, and modules from various vendors are placed onto the chip. This is called hard placement, classified as timing-based or congestion-based placement: if timing is the concern, the engineer applies the timing-based model, and if area is the main interest, one should go for the congestion-based model. After this phase, the engineer again performs time design, which is called pre-CTS: before clock tree synthesis the user once again checks the timing, and if the slack is positive, the CTS (clock tree synthesis) phase takes place.

CTS
CTS is the phase whose main aim is to obtain zero skew; to follow it, one should know what skew is. [3] The clock tree has buffers, and the clock edge reaches the launch flop through the buffers over a set of wires, so the combinational delay requirement changes once the clock network is taken into consideration. The clock edge that was supposed to arrive at 0 ns now arrives after the launch-path delay
δ1 = buffer1 delay + buffer2 delay,
while the capture flop sees the capture-path delay
δ2 = buffer1 delay + buffer3 delay + buffer4 delay.
So a clock edge generated at t = 0 actually arrives at t = 0 + δ1, and the edge generated at t = T arrives at t = T + δ2; the clock cycle that previously ran from 0 to T now runs from δ1 to T + δ2. Hence, the combinational delay requirement becomes
(T + δ2) - Ts - Tsu > Tcomb + δ1,
where Ts is the clock-to-Q delay of the launch flop and Tsu is the setup time of the capture flop. The spatial variation in the arrival of a temporally equivalent clock edge is called skew. CTS aims to optimize the skew and tries to deliver all clock edges at the same time by either inserting or reducing delay; additional clock buffers and clock inverters optimize the clock path. Again, after adding these elements, timing is verified, which is post-CTS. Once timing passes with positive slack, the next step is routing.
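As a numeric illustration of the setup requirement above (the values here are invented for the sketch): with T = 10 ns, Ts = 0.2 ns, Tsu = 0.1 ns, δ1 = 0.4 ns, and δ2 = 0.6 ns, the requirement reads (10 + 0.6) - 0.2 - 0.1 > Tcomb + 0.4, i.e. 10.3 > Tcomb + 0.4, so the combinational logic between the two flops may consume at most 9.9 ns.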
Routing
There are two kinds of routing: global routing and detailed routing. Global routing applies when two blocks in a design can be connected in multiple ways, and the most optimized way to connect the two modules is chosen. Detailed routing is then performed according to the details from global routing: algorithms decide the best optimized path and interconnects, on which basis the detailed routers place three kinds of nets, 1) signal nets, 2) power nets, and 3) clock nets. Once these nets are routed, routing is complete. Checking the timing again after routing is called post-route static timing analysis, which checks setup and hold violations. If the timing passes successfully, the reports are passed to the DRC engineers, who run Calibre checks to test whether DRC or net violations exist. If a net violation exists, it is passed to the signal integrity engineer.

Signal Integrity
There exist some problems that the SI engineer solves. When the technology node is below 45 nm, problems such as crosstalk, electromagnetic coupling, field transmission, ringing, noise, and migration appear; the SI step checks for all such problems. Performing static timing analysis again after the signal integrity fixes is called post-SI timing verification. To avoid internal effects like crosstalk, shields are inserted, after which the post-SI timing analysis is performed again. Once post-SI timing passes, the tapeout is generated and sent to fabrication for manufacturing, which is the last step in the design flow.

The Significance of Verification [2]
The process of demonstrating the functional correctness of a design is called verification: a way in which the user can measure whether the implemented design is functionally correct with respect to the design specification. It is necessary to verify the design; without verification, certain questions make any guarantee about the design doubtful. If the specification is incorrect or insufficient, the design will be interpreted wrongly: engineers can misinterpret or misunderstand the specification, which leads to errors later on. Incorrect interaction between IPs and cores is another risk, especially when the design involves multiple IPs and cores built by different engineers; each engineer may interpret the specification differently, and when all the IPs interact with each other, complications can result. If the user does not verify the result, the system can behave unexpectedly. Hence, verification exists to ensure the functionality of the system is correct; without it, these hurdles prevent ever reaching the functionally expected design. Any bug that escapes the design process into the actual silicon can be very costly and can result in a re-spin of the chip. Verifying the design consumes around 70% of the design cycle. As design complexity keeps increasing day by day, it is very hard to make sure the design is functionally correct against all specifications before the actual silicon is made. Hence, verification is always on the critical path of any product design.
Verification classification
There are mainly three kinds of verification. The first is functional verification, a process focused on making sure the functionality satisfies the design specification; it includes checks such as which protocols are implemented and verification of the features the design supports. Second, timing verification verifies that the actual timing implementation of the design meets positive slack and behaves at the expected frequency; if there is any change in the design, the timing verification must be repeated to meet timing, as seen previously, so performing pre-CTS, post-CTS, post-SI, and similar checks is important. Third, performance verification is a process focused on the actual performance goal of the design. For example, if a microprocessor design expects n instructions per cycle, verifying the successful execution of the expected n instructions falls under performance verification; if a memory design is supposed to be capable of a read and a write every cycle, then the actual design should be able to read and write every cycle.

Methods of verification
There are various ways to verify a design: simulation-based verification, emulation/FPGA-based verification, formal verification, semi-formal verification, and HW/SW co-verification.

Verification: Planning, Metrics, Approaches [2]

Verification Plan
The first step is the verification plan, a specification document for the verification effort that captures the planning the user executes to perform verification and ensures all essential features are verified. The document lists all the significant features and the information needed to verify them under different conditions, along with the details: whether features are verified independently or in combination under stress conditions. The document also captures how to verify all the features and which methodologies to use (simulation-based verification, formal verification, and so on), what checking methods to use, and whether coverage needs to be applied. The plan also shows what kind of simulation the engineer should apply, which checkers to implement, and the type of coverage. Finally, the plan assigns priorities for verifying features: as design complexity increases day by day with the number of features, one feature takes priority over another as to
which features must be covered first and which can be given lower priority. Hence, the verification plan is a guide containing all the important information the engineer needs to perform the verification.

Verification Approaches
Verification has three approaches. First is black-box verification, a process where the engineer does not need knowledge of the design implementation; the only information the engineer needs is the features the design supports, and the design is treated like a black box during verification. There are pros and cons to this approach: the advantage is that the tests are independent of the implementation, while the major disadvantage is the lack of visibility and observability, since the engineer does not look at the design implementation. With the high complexity engineers deal with in today's world, it is impractical to apply a pure black-box approach in which no knowledge of the design implementation is needed.

Second is the white-box approach, in which there is a good, deep understanding of the design implementation, giving the advantage of full visibility and observability. This lets the engineer write better test cases and stimulus to cover every feature and functionality of the design, and during debug, good observability helps the debug process. The disadvantage is that white-box tests are bound to a specific implementation, which limits the reuse of tests across projects when the implementation changes. Hence, white-box testing can be a good verification approach with increased observability and visibility compared to the black box, but it limits the reusability of test cases.

Third, the grey-box approach is a compromise between the black-box and white-box approaches. The engineer does not need to know everything about the design in depth, but there are some features the verification engineer does need to know. The grey box thus combines the two approaches, giving the white box's advantage of visibility and observability and the black box's advantage of reusability. Hence, each of the three approaches has its own place in different applications.
The Level of Verification
The diagram shows how the design is partitioned during the product design phase. Every design is partitioned into systems, subsystems, core IP blocks, units, and subunits, and the design is built up with a bottom-up approach. Most subunits and units needed for the overall design are developed in parallel. Once the design units are in place, they are put together into core blocks; the core blocks are put together to form subsystems, which are built into the system on chip, which in turn makes the system assembled on a board. As the design is partitioned, verification is also partitioned across multiple levels, and each level of verification is suited to a specific objective. The lower level of verification is the verification of a unit, subunit, or core IP block in a stand-alone environment. At the unit or subunit level there is no need to build a full test bench or verification environment; ad-hoc basic operations work, such as forcing certain input signals and checking that the design's output behavior matches the specification. The more verification an engineer does at the lower levels, the better the chances of finding most of the bugs in parallel with the design, which saves time early in verification. Verification of the multiple units built into a system or subsystem, up to product-level verification, proceeds along with design development, starting from the beginning of the design and focusing on the different levels of verification.

Verification Metrics
Verification metrics help the engineer track progress toward the completion of the plan and the quality of verification. Some of these metrics are, first, measurement metrics: features to verify, stimulus to verify, how many tests have been written, whether the stimulus generator is written, pass rates, and failure rates. These
metrics help the engineer track the actual, present situation against what was planned. Second, historical trend metrics: these grow as the number of projects increases; metrics from past projects can help the user determine how the present project is trending and compare the quality of the current project, better or worse, against past projects.

Verification Methods [2]
Some of the verification methods most commonly used are the following.

Simulation-Based Verification
The diagram shows the design implementation surrounded by a software simulator; this is the most commonly used form of design verification. A test vector generator generates an input stimulus for the design, and the design under test is simulated using the simulator. A golden reference model with a scoreboard or checker generates the output treated as the golden output, which is compared with the actual output coming from the design implementation to ensure the correctness of the design. Based on the complexity of the design implementation, the test vector generation can be as simple as a directed test generator or as complex as a constrained-random test generator. Approaches like assertions and coverage metrics determine whether verification is complete (a small sketch of this structure follows below).

Formal Verification
Formal verification is the method through which the user proves or disproves a design implementation against a formal specification or property. This method uses mathematical reasoning and algorithms to prove the design against the formal specification; it exhaustively explores all possible input values over time, exercising the various state spaces of the design. This verification method works well when designs are small and the numbers of inputs, outputs, and states are small; with increasing design state, formal verification may not be as effective. The diagram shows how equivalence checking is done. RTLA/GatesA is the reference or golden design, and during implementation the design is RTLB/GatesB, a target design modified for a specific purpose or built with a different library; equivalence checking of A and B checks whether they are functionally equivalent. This does not guarantee that both are correct, but it guarantees the two implementations remain functionally equivalent. In another variant, a further reference RTLC/GatesC is checked for equivalence; here designs C and B are checked.
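The sketch below illustrates the simulation-based structure described above: a stimulus generator, a golden reference model, and a checker (this code is an illustration added here, reusing the half_adder module from earlier; it is not from the original paper):

module tb_sim_based;
  reg a, b;
  wire sum, carry;
  half_adder dut (.a(a), .b(b), .sum(sum), .carry(carry));
  integer i;
  reg exp_sum, exp_carry;
  initial begin
    for (i = 0; i < 100; i = i + 1) begin
      {a, b} = $random;       // stimulus generator (random test vectors)
      #1;
      exp_sum   = a ^ b;      // golden reference model
      exp_carry = a & b;
      if (sum !== exp_sum || carry !== exp_carry)  // checker / scoreboard
        $error("Mismatch: a=%b b=%b", a, b);
    end
    $finish;
  end
endmodule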
Model checking is another commonly used form of formal verification: it exhaustively searches a finite state machine for state or property violations. This model is used when the design is mostly based on a finite state machine. In this approach, there are two inputs to the algorithm: a finite-state transition system representing the implementation, and a formal property representing the specification. The engineer gives these inputs to the algorithm, which exhaustively explores all input combinations that could violate the FSM model; if it does not find any input value combination that violates the finite state machine model, the system is formally verified. The diagram explains model checking: the FSM model and the formal property, a specification of the same design, are the inputs to the model-checking tool. The tool algorithmically explores all possible input combinations across the state space of the design and comes up with a counterexample, a combination that can fail the design; if the model does not fail for any combination, it is formally verified (a small FSM-plus-property sketch follows below). There are many advantages of formal verification: it covers the exhaustive state space, which is hard in simulation, where the user creates tests manually, and there is no need to generate any input stimulus, since the tools automatically generate exhaustive stimulus.
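As an illustration of capturing a specification as a formal property (a sketch invented for this discussion; the FSM and its signal names are assumptions, not from the paper):

// A small FSM and a property a model checker could prove exhaustively.
module req_grant_fsm (input logic clk, rst, req, output logic grant);
  typedef enum logic [1:0] {IDLE, BUSY, DONE} state_t;
  state_t state;
  always_ff @(posedge clk or posedge rst) begin
    if (rst) state <= IDLE;
    else case (state)
      IDLE:    if (req) state <= BUSY;
      BUSY:    state <= DONE;
      default: state <= IDLE;
    endcase
  end
  assign grant = (state == BUSY);

  // Formal property: a request seen in IDLE is granted on the next cycle.
  assert property (@(posedge clk) disable iff (rst)
                   (state == IDLE && req) |=> grant)
    else $error("grant did not follow req");
endmodule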
There is also no need to generate expected output sequences such as checkers, a reference model, or a golden output, and the correctness of the design is guaranteed mathematically. It has these advantages, but it has disadvantages too: formal verification is not scalable to large designs. With each additional flip-flop, the state space doubles, and the verification tools have to cover exhaustively more combinations; hence it is limited to designs with small state spaces, such as interfaces and small FSMs. Scalability is not obtained in larger designs. For every specification, the user needs to translate the specification into some property specification language, which becomes difficult as the state space keeps increasing, and verifying that the properties hold true under all input and state conditions is difficult.

Semi-Formal Verification
Semi-formal verification takes the best of both simulation and formal verification. For a design with a large specification, where formal verification cannot be relied on alone, simulation is used first to cover the critical parts of the design, after which the user does exhaustive formal verification around those critical parts to cover their state space. This approach helps cover a larger design state space faster and helps find bugs deep in the design, as a result of the formal verification around the critical logic; hence it is useful for bug hunting.

Assertion
An assertion is a statement about a design's intended behavior which must be verified: a way of capturing design intent, or a part of the design specification, in the form of a property that can be used in dynamic simulation or formal verification to make sure the specific intent is met. Assertion-based verification has many advantages. It improves observability and debuggability, because the design intent is captured in a property and the user knows when the specific property was hit during debug. It improves correct-usage checking during integration: different people build the multiple units, and during integration of these units into the top-level design, each unit's assertions help ensure the design still works properly at the integration level. It improves verification efficiency, saving much of the time otherwise spent in debug and testing. It improves communication, as it is a way of capturing the specification in documentation, and it is used in both static verification and dynamic simulation.

There are two types of assertion. The first is the immediate assertion, which checks a property at a given point in time, for example:

assert (A == B) else $error("Wrong");

At the given point in time, if A and B are not equal, the design intent is violated and the error is reported. The second type is the concurrent assertion, which describes behavior that spans time and is evaluated at every clock, for example:

property p1;
  @(posedge clk) disable iff (Reset) not (b ##1 c);
endproperty
assert property (p1) else $error("b ##1 c failed");

This property is sampled at every positive clock edge and is disabled while Reset is asserted; it states that the sequence "b true, then c true one cycle later" must never occur. If that sequence does occur, the assertion fails.
Assertion-Based Verification
Assertion-based verification is a method in which assertions are used in formal or dynamic verification. In formal verification, assertions specify the functional requirements and behavior of the design: the user can write all the formal properties using assertions, and the method uses a procedure to prove that all the specified properties are correct. The user can also use assertions in dynamic simulation, as checkers running during the simulation, and the specified properties can serve as coverage conditions, so the user can verify that the dynamic simulation conditions are met. Hence, assertions serve both formal verification and dynamic simulation. In summary, simulation-based verification is most commonly used to find bugs; formal verification is used in selected areas like finite-state designs, especially when the state space is small; and assertions are significant to both simulation-based and formal verification.

Directed Verification vs. Random Verification [5]
There are two ways in which verification can be performed. First, directed verification is the approach of creating a test for each targeted feature in the design; if multiple features have to operate under different conditions, the user has to create a targeted test for each feature under each condition. This approach works well when the design is at an early stage of development and not yet mature, when the design scope is limited and it is possible to check every condition of the design. Second, random verification is an approach where a test is not created for every feature; instead, a random test generator creates multiple scenarios in a random manner. Both approaches have advantages and disadvantages. If the design complexity is high and the design is mature, during the later stages of development, the random approach is useful. Directed verification can only cover scenarios through thought-out planning: during the verification planning stage, only the scenarios the designer thinks of get a test, so only the features the user thinks of get tested, and this approach has a high maintenance cost. The directed approach works well when the condition space is finite and all test scenarios can be planned; when the design space is huge, it is not possible to cover all scenarios in the given time. The good part is that there is no need for extensive coverage coding, since the tests by construction hit all the planned scenarios, so no effort is needed to check whether all scenarios were hit. The pure random approach depends on a random test generator that can randomly hit all the scenarios; to make sure the whole condition space is covered, it is necessary to run an enormous number of cycles. The good part of this approach is that there is less user control and the user depends more on the generator, which generates the random scenarios. In order not to let the generator be purely random, there is an approach that lies between these two: constrained random verification, which constrains the randomness to the critical scenarios of the design. This approach has a high ramp-up time, since the user has to understand the design scenarios and code intelligent random generators.
To measure constrained randomness and completeness, the user needs coverage metrics. The good part is that this approach gives deep control and can hit all the different state spaces in a design. It is the best balance between engineer time and compute time: the user does not have to run too many tests or depend on too much randomness, and because the generator is constrained, the scenarios are hit faster. The randomness can be static or dynamic, meaning the random tests are either built up front or the generator dynamically creates randomness to hit more scenarios. Constrained random is considered best for complex designs, as it is not possible to create all the test scenarios by hand, and the user also cannot rely on pure randomness in the generator to hit everything by luck. A small sketch of a constrained-random stimulus class follows below.
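The following SystemVerilog class is an illustrative sketch of constrained-random stimulus (the transaction fields, ranges, and weights are invented for this example):

class bus_txn;
  rand bit [7:0] addr;
  rand bit [7:0] data;
  rand bit       write;
  // Constrain the randomness toward critical scenarios:
  constraint c_addr { addr inside {[8'h00:8'h3F]}; }  // stay in the legal window
  constraint c_bias { write dist {1 := 7, 0 := 3}; }  // bias toward writes
endclass

module tb_random;
  initial begin
    bus_txn t = new();
    repeat (10) begin
      if (!t.randomize()) $error("randomize() failed");
      $display("addr=%h data=%h write=%b", t.addr, t.data, t.write);
    end
  end
endmodule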
The first diagram shows the percentage of functional coverage hit versus the time taken. With the directed testing approach, every test case hits certain functional coverage, so coverage improves step by step toward the 100% target. The constrained-random approach requires a lot of up-front effort to build the test generator, during which the user may not really be testing anything, so it has an initial ramp-up time; but once the engineer generates the initial tests, coverage jumps quickly. Again, there are scenarios the tests do not reach, and when those are covered there is another quick jump; this pattern of plateaus followed by quick jumps repeats, and the approach can reach 100% coverage faster than the directed approach. The second graph shows the comparison of engineer time versus compute cycles spent for the three approaches. In directed testing, the user initially builds each test for every scenario and, as time goes on, builds bigger tests, spending a lot of both engineer time and compute cycles. In completely random testing, a lot of engineer time goes into coding coverage and building randomness, which then runs through billions of cycles. The constrained-random approach is a balance between the two: initially the engineer spends a lot of time coming up with constraints and coverage, and then uses compute cycles effectively to get better coverage. Hence, the constrained-random approach has relatively low engineering cost as well as compute cost.

Coverage
Coverage is the metric of completeness of verification. The reason to measure coverage is that it is needed most for complex designs: the user verifies based on samples, not on exhaustive verification of the complete space. With n inputs it is practically impossible to run all 2^n possible tests to cover the full space, which could take millions of years to complete.
Hence, the user usually follows the constrained-random approach, creating tests around critical scenarios, and needs to know exactly which areas of the design need verification. There are two types of coverage that help identify which areas of the design are verified well.

Code Coverage
This coverage gives metrics analyzing how well the code is exercised during simulation. There are various kinds of code coverage. Statement coverage is a metric showing how much of the source code the user executes across the different tests. Branch coverage shows whether every control structure evaluates to both true and false (branch conditions in if, case, forever, for, while, and loop statements), i.e., whether the tests cover the branching conditions. Condition coverage reports whether every Boolean sub-expression has been exercised to both true and false. Expression coverage reports on the right-hand side of assignments: for x <= (not a) xor b, the coverage reports whether all combinations of the xor and not have been covered. Toggle coverage reports logic node transitions: standard toggle coverage reports every node transition from 0 to 1 and 1 to 0, while extended toggle coverage also reports the z<->1 and z<->0 transitions. FSM coverage reports on the different state transitions and on arc coverage.

Functional Coverage
Functional coverage is different from code coverage: it covers the functionality of the design under test, derived from the design specification, and tools cannot generate functional coverage automatically the way they do code coverage. The user needs to create a functional coverage monitor, which the tool uses to extract coverage during simulation (a sketch of such a monitor follows below). This checks whether all possible DUT input operations are injected and whether all possible DUT outputs and functional responses are seen on every output port. Internal DUT coverage of all design events of interest is also verified, for example whether the FIFOs become full and whether the DUT's arbitration mechanisms are exercised. The diagram shows coverage-driven verification: from the verification plan, the user creates a set of metrics stating which scenarios are to be covered, then generates random tests, runs them, and collects reports to check the coverage goals. If the goals are met, verification is complete; if not, the user identifies which constraints missed coverage and generates new tests or enhances the test generators to hit those scenarios. The coverage metrics can also be enhanced, and this process is repeated until verification is complete.
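A functional coverage monitor in SystemVerilog might look like the following (an illustrative sketch; the opcode and mode signals are assumptions made for this example):

module cov_monitor (input logic clk, input logic [3:0] opcode, input logic mode);
  covergroup op_cov @(posedge clk);
    cp_op:     coverpoint opcode { bins add = {4'h0}; bins sub = {4'h1}; }
    cp_mode:   coverpoint mode;
    op_x_mode: cross cp_op, cp_mode;  // cover all opcode/mode combinations
  endgroup
  op_cov cg = new();  // instantiate the covergroup so sampling begins
endmodule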
Latest Trends in Verification [6]

Hardware-Accelerated Simulation / Emulation
This is an alternative method used especially at the higher levels of verification, such as system or full-chip verification, because simulation-based verification slows down as the design size increases: for a full chip or SoC, where design sizes are huge, simulation runs very slowly and an alternative is needed. A simulator still runs on a host CPU, evaluating the signals at every cycle or every event change. At the higher levels, as an alternative, the user moves the time-consuming part into hardware. As the diagram shows, the setup has the modules representing the design implementation, and the test bench comprises all the behavioral components, such as stimulus generators and checkers. The most time-consuming part of the simulation model is the design implementation, in terms of the evaluation of modules at every event, signal change, or cycle. In this model, the most time-consuming modules are synthesized and compiled into real hardware, an FPGA, which runs much faster on real components, while the test bench components continue to run on the host CPU; some means of communication is then devised between the test bench components running on the simulator and the design running in the actual hardware. This approach gains a speedup of at least 10x to 20x.
Challenges
There are many challenges with this method. It improves speed by moving hardware components onto a real system like an FPGA, but it degrades HW-SW communication: if the hardware and software communicate at every event change, the model may not deliver the speed benefit. Abstracting the HW-SW communication to the transaction level, rather than every clock cycle, means the communication happens at transaction boundaries instead of every cycle, which helps increase speed. These are some of the challenges that have to be overcome; hence, hardware emulation is used. Hardware emulation is a full mapping of the hardware, which can be an array of FPGAs acting like the real target system, and it helps achieve speedups of up to 1000x. There are several approaches to full hardware emulation, such as synthesizing the test bench onto the emulator if it is fully synthesizable, or setting up a configuration that looks like the real hardware system and using a real stimulus generator, as in real silicon testing. In either case, debug is still a challenge, and the user may not gain full visibility into the internals of the design. Usually, HW/SW co-verification is what is mostly in use.

HW/SW Co-verification
SoC design involves a hardware development phase and a software development phase. At a high level, the user comes up with a system specification of what the system has to do, which during system design is divided between what the hardware has to do and what the software has to do. On one side, in hardware development, an engineer implements the actual design from RTL down to gates based on the specification; on the other side, software development proceeds from the specification down to the different levels, such as the OS and driver software, and the user develops the software code. The two have to work together on the real system. In both phases, hardware verification involves functional and gate-level verification to ensure the hardware works correctly, while in parallel, software verification checks that the software works stand-alone. Traditionally, the user never checked that hardware and software work correctly together before going to actual silicon. With design complexity increasing day by day, it is becoming important to ensure that the hardware and software work together before the actual silicon, to avoid silicon bugs; hence, HW/SW co-verification is becoming more and more important. Co-verification helps complete the project in a shorter time and gain higher confidence in the quality of both the HW and the SW.
Conclusion
In this paper the reader learned what verification is and why it is done in an SoC or chip design flow. Verification is the process by which the user measures the functional correctness of a design; if it is not ensured that the design is functionally correct, bugs can reach the silicon and force a re-spin, costing millions of dollars. Verification is thus becoming the most significant part of the design flow. The reader saw where verification lies in the design flow (pre-CTS, post-CTS, post-SI, and whenever there is a change in the design) and how verification runs in parallel with the design phase at the unit, subunit, and system levels. Different methods exist, such as simulation-based, formal, and assertion-based verification. There was a comparison between directed and random verification methods, and a way to combine the advantages of both as constrained-random verification. The reader also saw what coverage is and why coverage is important in random or constrained-random testing. Finally, the latest trends were covered: hardware-accelerated emulation to check how the system behaves in a real environment, and HW/SW co-verification and its significance for the proper synchronization and working of hardware and software together. Hopefully this gave the reader the basic verification concepts.

References
[1] https://www.youtube.com/watch?v=Y2PQzc9Gqsw
[2] "SOC Verification Using SystemVerilog," Udemy. Web. 10 May 2016. <https://www.udemy.com/soc-verification-systemverilog>.
[3] Ishwaki Thakkar, "How to Time Logic," EE-287 course paper.