Dr. Shivananda (Shivoo) R Koteshwar
Design Group Digital Implementation Site Leader and R&D Head
Synopsys India
LINKEDIN : https://in.linkedin.com/in/shivoo2life
FACEBOOK : shivoo.koteshwar
SLIDESHARE : www.slideshare.net/shivoo.koteshwar
Dayananda College, Bengaluru
March 2019
1. Basics
2. Verification Challenges
3. Verification Technologies
4. Verification Strategies
5. Verification Methodologies
6. Skills needed for today’s corporate job
7. Q&A
• Design synthesis:
▪ Given an I/O function, develop a procedure to manufacture a device using
known materials and processes
• Verification:
▪ Predictive analysis to ensure that the synthesized design, when
manufactured, will perform the given I/O function
• Test:
▪ A manufacturing step that ensures that the physical device, manufactured
from the synthesized design, has no manufacturing defect.
 Goal: Validate a model of the design
 Testbench wraps around the design under test (DUT)
 Inputs provide (deterministic or random) stimulus
▪ Reference signals: clock(s), reset, etc.
▪ Data: bits, bit words
▪ Protocols: PCI, SPI, AMBA, USB, etc.
 Outputs capture responses and make checks
▪ Data: bits, bit words
▪ Protocols: PCI, SPI, AMBA, USB, etc.
[Figure: the testbench wraps around the Design Under Test, driving its inputs and observing its outputs]
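To make the structure above concrete, here is a minimal SystemVerilog sketch of a self-checking testbench. Everything in it (the dut module, its ports, its +1 behavior) is hypothetical and only illustrates the wrap-around relationship between testbench and DUT.

```systemverilog
// Hypothetical DUT used only for illustration: registers data_in + 1.
module dut (
  input  logic       clk,
  input  logic       rst_n,
  input  logic [7:0] data_in,
  output logic [7:0] data_out
);
  always_ff @(posedge clk or negedge rst_n)
    if (!rst_n) data_out <= '0;
    else        data_out <= data_in + 8'd1;
endmodule

// Testbench: generates reference signals (clock, reset), drives random
// data stimulus on the inputs, and checks the captured outputs.
module tb_top;
  logic       clk = 0;
  logic       rst_n;
  logic [7:0] data_in;
  logic [7:0] data_out;

  always #5 clk = ~clk;                       // free-running clock

  dut u_dut (.clk, .rst_n, .data_in, .data_out);

  initial begin
    rst_n   = 0;                              // apply reset
    data_in = '0;
    repeat (2) @(posedge clk);
    rst_n   = 1;

    repeat (20) begin                         // random stimulus + checking
      data_in = $urandom_range(0, 255);
      @(posedge clk);
      #1;                                     // let the output settle
      if (data_out !== (data_in + 8'd1))
        $error("Mismatch: in=%0d out=%0d", data_in, data_out);
    end
    $display("Test finished");
    $finish;
  end
endmodule
```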
Verification is the process of checking that the transformation steps in the design flow are executed correctly.
[Design-flow diagram: transformations from Idea → Algorithm → Architecture/Spec → RTL → Gate → GDSII → ASIC → End product, with verification activities along the way: Spec Acceptance Review, C-Model Simulation / Code Review, Formal Functional / Timing Verification, ATE, Sign-Off Review, and Product Acceptance Test]
 Ensure full conformance with the specification:
▪ Must avoid false positives (a design that passes simulation only because some functionality was never tested)
Simulation result vs. RTL code:
▪ Good RTL, Pass → Tape out!
▪ Good RTL, Fail → Debug the testbench
▪ Bad RTL (bug), Fail → Debug the RTL code
▪ Bad RTL (bug), Pass → ??? A false positive: results in shipping a bad design
How do we achieve this goal?
 Simulators are the most common and familiar verification tools. They are named simulators because
their role is limited to approximating reality.
 A simulation is never the final goal of a project. The goal of all hardware design projects is to create real
physical designs that can be sold and generate profits.
 Simulators attempt to create an artificial universe that mimics the future real design. This lets the
designers interact with the design before it is manufactured, and correct flaws and problems earlier.
 Simulators are only approximations of reality
▪ Many physical characteristics are simplified - or even ignored - to ease the simulation task. For example, a digital
simulator assumes that the only possible values for a signal are ‘0’, ‘1’, X, and Z. However, in the physical and
analog world, the value of a signal is continuous, with an infinite number of possible values. In a discrete simulator,
events that happen deterministically 5 ns apart may be asynchronous in the real world and may occur randomly
 Simulators are at the mercy of the descriptions being simulated
▪ The description is limited to a well-defined language with precise semantics. If that description does not
accurately reflect the reality it is trying to model, there is no way for you to know that you are simulating
something that is different from the design that will be ultimately manufactured. Functional correctness and
accuracy of models is a big problem as errors cannot be proven not to exist.
 Simulation requires stimulus
▪ Simulators are not static tools. A static verification tool performs its task on the design
without any additional information or action required by the user. For example, linting tools
are static tools. Simulators, on the other hand, require that you provide a facsimile of the
environment in which the design will find itself. This facsimile is often called a testbench or stimulus.
▪ The testbench needs to provide a representation of the inputs observed by the design, so
the simulator can emulate the design’s responses based on its description
 The simulation outputs are validated externally, against design intents.
▪ The other thing that you must not forget is that simulators have no knowledge of your
intentions. They cannot determine if a design being simulated is correct. Correctness is a
value judgment on the outcome of a simulation that must be made by you, the designer.
▪ Once the design is submitted to an approximation of the inputs from its environment, your
primary responsibility is to examine the outputs produced by the simulation of the design’s
description and determine if that response is appropriate.
 Simulators are never fast enough
▪ They are attempting to emulate a physical world where electricity travels at the speed of light and transistors switch over
one billion times in a second. Simulators are implemented using general purpose computers that can execute, under ideal
conditions, up to 100 million instructions per second
▪ The speed advantage is unfairly and forever tipped in favor of the physical world
 Outputs change only when an input changes
▪ One way to optimize the performance of a simulator is to avoid simulating something that does not need to be simulated.
▪ Figure shows a 2-input XOR gate. In the physical world, if the inputs do not change (a), even though voltage is constantly applied, the output does not change. Only if one of the inputs changes (b) does the output change
 Change in values, called events, drive the simulation process
▪ The simulator could choose to continuously execute this model, producing the same output value if the input values did not
change.
▪ An opportunity to improve upon that simulator’s performance becomes obvious: do not execute the model while the inputs
are constants. Phrased another way: only execute a model when an input changes. The simulation is therefore driven by
changes in inputs. If you define an input change as an event, you now have an event-driven simulator
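A minimal sketch of the same idea in SystemVerilog (module and signal names are illustrative): the XOR model below is re-evaluated only when an event occurs on one of its inputs, which is exactly what an event-driven simulator exploits.

```systemverilog
module xor2 (
  input  logic a,
  input  logic b,
  output logic y
);
  // Evaluated only when an event (value change) occurs on a or b;
  // while both inputs are constant the simulator never re-executes it.
  always @(a or b)
    y = a ^ b;
endmodule

module tb_xor2;
  logic a = 0, b = 0, y;
  xor2 u_xor (.a(a), .b(b), .y(y));

  initial begin
    #10 a = 1;   // event on a -> the XOR model is re-evaluated
    #10 b = 1;   // event on b -> the XOR model is re-evaluated
    #10;         // no events -> the model is not executed at all
    $finish;
  end
endmodule
```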
 Cycle-based simulations have no timing information
▪ This great improvement in simulation performance comes at a cost: all timing and delay
information is lost. Cycle-based simulators assume that the entire design meets the set-up
and hold requirements of all the flip-flops.
▪ When using a cycle-based simulator, timing is usually verified using a static timing analyzer
 Cycle-based simulators can only handle synchronous circuits
▪ Cycle-based simulators further assume that the active clock edge is the only significant event in changing the state of the design. All other inputs are assumed to be perfectly synchronous with the active clock edge. Therefore, cycle-based simulators can only simulate perfectly synchronous designs
▪ Anything containing asynchronous inputs, latches, or multiple clock domains cannot be simulated accurately. The same restrictions apply to static timing analysis. Thus, circuits that are suitable for cycle-based simulation to verify functionality are also suitable for static timing analysis to verify timing
 To handle the portions of a design that do not meet the requirements for cycle-based simulation,
most simulators are integrated with an event-driven simulator
 As shown, the synchronous portion of the design is simulated using the cycle-based algorithm, while
the remainder of the design is simulated using a conventional event-driven simulator
 Both simulators (event-driven and cycle-based) are running together, cooperating to simulate the
entire design
◼ Other popular co-simulation environments provide VHDL and Verilog, HDL and C, or digital and analog co-
simulation
Design Errors / Simulation – Practical Problems
 Coverage
▪ Code Coverage
▪ Statement or Block Coverage
▪ Path Coverage
▪ Expression Coverage
▪ Functional Coverage
 Verification languages can raise the level of abstraction
 Best way to increase productivity is to raise the level of abstraction used to
perform a task
 VHDL and Verilog are simulation languages, not verification languages
 VHDL simulation tools can automatically calculate a metric called code coverage
(assuming you have licenses for this feature).
 Code coverage tracks what lines of code or expressions in the code have been
exercised.
 Code coverage cannot detect conditions that are not in the code
 Code coverage on a partially implemented design can reach 100%. It cannot detect
missing features and many boundary conditions (in particular those that span more
than one block)
 Code coverage is an optimistic metric. In combinational logic code in an HDL, a
process may be executed many times during a given clock cycle due to delta cycle
changes on input signals. This can result in several different branches of code being
executed. However, only the last branch of code executed before the clock edge
truly has been covered.
 Hence, code coverage cannot be used exclusively to indicate we are done testing.
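As a small illustrative sketch of the last two points (the module, opcodes, and missing feature are invented for this example): a partially implemented design can reach 100% code coverage while an unimplemented requirement goes completely unnoticed.

```systemverilog
// Hypothetical, partially implemented ALU: suppose the spec also requires a
// SUB operation, but the code only implements ADD and AND. A testbench that
// exercises every written branch (ADD, AND, and the default) reports 100%
// code coverage, yet the missing SUB feature is never flagged, because code
// coverage cannot see conditions that are not in the code.
module mini_alu (
  input  logic [1:0] op,      // 00 = ADD, 01 = AND (10 = SUB is missing!)
  input  logic [7:0] a, b,
  output logic [7:0] y
);
  always_comb begin
    case (op)
      2'b00:   y = a + b;
      2'b01:   y = a & b;
      default: y = '0;        // SUB silently falls through to the default
    endcase
  end
endmodule
```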
 Functional coverage is code that observes execution of a test plan. As such, it is code you write to track whether
important values, sets of values, or sequences of values that correspond to design or interface requirements,
features, or boundary conditions have been exercised
 Specifically, 100% functional coverage indicates that all items in the test plan have been tested. Combine this
with 100% code coverage and it indicates that testing is done
 Functional coverage that examines the values within a single object is called either point coverage or item
coverage
▪ One relationship we might look at is different transfer sizes across a packet based bus. For example, the test plan may
require that transfer sizes with the following size or range of sizes be observed: 1, 2, 3, 4 to 127, 128 to 252, 253, 254, or 255.
 Functional coverage that examines the relationships between different objects is called cross coverage. An
example of this would be examining whether an ALU has done all of its supported operations with every different
input pair of registers
 The Open Source VHDL Verification Methodology (OSVVM) provides a package, CoveragePkg, with a protected type that facilitates capturing the data structure and writing functional coverage
 Functional coverage provides additional supporting data that the design has been tested. It is a supplement to directed, algorithmic, file-based, or constrained-random test methods
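The two flavors of functional coverage above could look like the following SystemVerilog sketch; the class, field names, and bins are hypothetical, loosely following the transfer-size and ALU examples in the slide.

```systemverilog
typedef enum {ADD, SUB, AND, OR} alu_op_e;

class bus_xfer;
  rand bit [7:0] size;            // transfer size on a packet-based bus
  rand alu_op_e  op;              // ALU operation
  rand bit [2:0] src_a, src_b;    // source register indices

  // Point (item) coverage: interesting sizes and ranges from the test plan
  covergroup cg_size;
    coverpoint size {
      bins one       = {1};
      bins two       = {2};
      bins three     = {3};
      bins small_rng = {[4:127]};
      bins large_rng = {[128:252]};
      bins b253      = {253};
      bins b254      = {254};
      bins b255      = {255};
    }
  endgroup

  // Cross coverage: every ALU operation with every register pair
  covergroup cg_alu;
    cp_op : coverpoint op;
    cp_a  : coverpoint src_a;
    cp_b  : coverpoint src_b;
    op_x_regs : cross cp_op, cp_a, cp_b;
  endgroup

  function new();
    cg_size = new();
    cg_alu  = new();
  endfunction

  function void sample_all();
    cg_size.sample();
    cg_alu.sample();
  endfunction
endclass

module tb_cov_demo;
  initial begin
    bus_xfer t = new();
    repeat (1000) begin
      void'(t.randomize());
      t.sample_all();
    end
    $display("size cov = %.1f%%, alu cov = %.1f%%",
             t.cg_size.get_coverage(), t.cg_alu.get_coverage());
  end
endmodule
```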
 Completeness does not imply correctness:
▪ Code coverage indicates how thoroughly your entire verification suite exercises the source code. It does not provide any indication of the correctness of the verification suite
▪ Code coverage should be used to help identify corner cases that were not exercised by the verification suite or implementation-dependent
features that were introduced during the implementation
▪ Code coverage is an additional indicator for the completeness of the verification job. It can help increase your confidence that the verification
job is complete, but it should not be your only indicator.
 Code coverage lets you know if you are not done: Code coverage indicates if the verification task is not complete through low
coverage numbers. A high coverage number is by no means an indication that the job is over
 Some tools can help you reach 100% coverage: There are testbench generation tools that automatically generate stimulus to
exercise the uncovered code sections of a design
 Code coverage tools can be used as profilers: When developing models for simulation only, where performance is an important criterion, code coverage tools can be used for profiling. The aim of profiling is the opposite of code coverage: to identify the lines of code that are executed most often. These lines of code become the primary candidates for performance optimization efforts
 It is quite possible to achieve 100% code coverage but only 50% functional coverage
▪ Here the design is half complete
 Equally, it is possible to have 50% code coverage but 100% functional coverage
▪ Indicates that the functional coverage model is missing some key features of the design
▪ Indicates the design contains untested code that is not part of the test plan
▪ This can come from an incomplete test plan, extra undocumented features in the design, or case-statement "others" branches that do not get exercised in normal hardware operation
▪ Untested features need to either be tested or removed
▪ As a result, even with 100% functional coverage it is still a good idea to use code coverage as a fail safe for the
test plan
 Code Coverage is quantitative coverage and functional coverage is qualitative
coverage
 The two coverage approaches are complementary, and high quality verification will
benefit from both.
Test Done = Test Plan Executed and All Code Executed
REF: https://www.doulos.com/knowhow/sysverilog/uvm/easier_uvm_guidelines/coverage-driven
 IP / Module Level Verification
1. Study the DUT and its specification
2. Gather requirements for features to be verified and set priorities
3. Review requirements with the IP architect/designer (requirements should cover all parameters of the module)
4. Design the test infrastructure on paper / in a document (includes re-usable verification components)
5. Review the TB architecture with the verification team
6. Build the test infrastructure (includes re-usable verification components)
7. Code testcases as per the test-bench plan; also code functional coverpoints / assertions
8. Complete verification such that functional coverage is 100% and code coverage numbers are logged
9. Review code coverage numbers with the designer to eliminate dead-code possibilities
10. Sign off module-level verification by checking in the files having relevant data such as logs
 SoC Level Verification
1. Study the SoC and its specification
2. Gather requirements for critical data paths and set priorities
3. Review requirements with the IP architect/designer (requirements should cover all parameters of the module)
4. Design the test infrastructure on paper / in a document; identify testcases that can be re-used (includes re-usable verification components)
5. Review the TB architecture with the verification team
6. Build the test infrastructure (includes re-usable verification components)
7. Code testcases as per the test-bench plan; also code functional coverpoints / assertions
8. Complete verification such that functional coverage is 100% and code coverage numbers are logged
9. Review code coverage numbers with the designer to eliminate dead-code possibilities
10. Sign off SoC verification by checking in the files having relevant data such as logs
TESTBENCH ENVIRONMENT / ARCHITECTURE
 Accellera Systems Initiative is an independent, not-for-profit organization dedicated to create, support, promote, and advance system-level design, modeling, and verification standards for use by the worldwide electronics industry
 www.accellera.org
 Verification languages can raise the level of abstraction
▪ Best way to increase productivity is to raise the level of abstraction used to perform a
task
 VHDL and Verilog are simulation languages, not verification languages
▪ Verilog was designed with a focus on describing low-level hardware structures. It does
not provide support for high-level data structures or object-oriented features
▪ VHDL was designed for very large design teams. It strongly encapsulates all information
and communicates strictly through well-defined interfaces
▪ Very often, these limitations get in the way of an efficient implementation of a
verification strategy. Neither integrates easily with C models
▪ This creates an opportunity for verification languages designed to overcome the shortcomings of Verilog and VHDL. However, using a verification language requires additional training and tool costs
 Proprietary verification languages exist
▪ e/Specman from Verisity, Vera from Synopsys, Rave from Chronology, etc.
 Provides a reusable, standard infrastructure in the form of pre-defined base classes. These can be extended and enhanced as per user needs
 Defines rules to create behavioral models, also known as Verification Components (OVC/UVC)
 Defines standards for higher-level modelling of input stimulus using Transaction Level Modelling (TLM)
 Defines rules to have a layered structure of test-benches
 In summary, Methodology = standardization of the way of creating complex test-benches with constrained-random test-vectors.
 OVM
▪ Open Verification Methodology
▪ Derived mainly from the URM (Universal Reuse Methodology) which was, to a large part, based on the eRM (e Reuse Methodology) for the e Verification Language developed by Verisity Design in 2001
▪ The OVM also brings in concepts from the Advanced Verification Methodology (AVM)
▪ SystemVerilog
 RVM
▪ Reference Verification Methodology
▪ Complete set of metrics and methods for performing functional verification of complex designs
▪ The SystemVerilog implementation of the RVM is known as the VMM (Verification Methodology Manual)
 OVL
▪ Open Verification Language
▪ The OVL library of assertion checkers is intended to be used by design, integration, and verification engineers to check for good/bad behavior in simulation, emulation, and formal verification.
▪ Accellera - http://www.accellera.org/downloads/standards/ovl/
 UVM
▪ Standard Universal Verification Methodology
▪ Accellera - http://www.accellera.org/downloads/standards/uvm
▪ SystemVerilog
 OS-VVM
▪ Open Source VHDL Verification Methodology
▪ VHDL
▪ Accellera
OVC: OVM Verification Component
UVC: Universal Verification Component
 C type data types like int, typedef, struct, union, enum
 Dynamic data types: struct, classes, dynamic queues, dynamic arrays
 New operators and built in methods
 Enhanced flow control like foreach, return, break, continue
 Inter-process synchronization – Semaphores, Mailboxes, Event
Extension
 Assertions and Coverage
 Clocking Domains
 Direct Programming Interface (DPI) / VPI
 Hardware specific procedures
REF: http://www.eetimes.com/document.asp?doc_id=1277143
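A short sketch exercising a few of the SystemVerilog features listed above: a class with constrained-random fields, an enum, a dynamic queue, and a mailbox for inter-process synchronization. All names are invented for illustration.

```systemverilog
typedef enum {READ, WRITE} cmd_e;

class packet;
  rand cmd_e     cmd;
  rand bit [7:0] addr;
  rand bit [7:0] data;

  // Constraint: keep writes out of a reserved address window
  constraint c_addr { (cmd == WRITE) -> addr inside {[8'h10:8'hEF]}; }
endclass

module tb_features;
  mailbox #(packet) mbx = new();   // inter-process synchronization
  packet            log_q[$];      // dynamic queue of sent packets

  // Producer process: generates constrained-random packets
  initial begin
    repeat (5) begin
      packet p = new();
      if (!p.randomize()) $error("randomize failed");
      log_q.push_back(p);
      mbx.put(p);
      #10;
    end
  end

  // Consumer process: blocks on the mailbox until a packet arrives
  initial begin
    packet p;
    repeat (5) begin
      mbx.get(p);
      $display("%0t: %s addr=%0h data=%0h", $time, p.cmd.name(), p.addr, p.data);
    end
    $display("queued %0d packets", log_q.size());
    $finish;
  end
endmodule
```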
 UVM (Universal Verification Methodology) is a SystemVerilog language based verification methodology
 UVM consists of a defined methodology for architecting modular testbenches for design verification.
 UVM has a library of classes that helps in designing and implementing modular testbench components and stimulus. This enables re-using testbench components and stimulus within and across projects, development of Verification IP, easier migration from simulation to emulation, etc.
 Relies on strong, proven industry foundations. The core of its success is adherence to a standard (i.e. architecture, stimulus creation, automation, factory usage standards, etc.)
 Following can be automated using UVM
▪ Coverage Driven Verification (CDV) environments
▪ Automated Stimulus Generation
▪ Independent Checking
▪ Coverage Collection
[Figures: SV Testbench Architecture and UVM Testbench Architecture]
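A minimal, hedged sketch of a UVM-style testbench built from the library base classes (component names are invented; a real environment would also contain agents, drivers, sequencers, monitors, and a scoreboard as in the architecture referenced above):

```systemverilog
import uvm_pkg::*;
`include "uvm_macros.svh"

// Minimal environment: a real one would instantiate agents (driver,
// sequencer, monitor), a scoreboard, and coverage collectors.
class my_env extends uvm_env;
  `uvm_component_utils(my_env)
  function new(string name, uvm_component parent);
    super.new(name, parent);
  endfunction
endclass

// Minimal test: builds the environment, raises an objection, and ends.
class my_test extends uvm_test;
  `uvm_component_utils(my_test)
  my_env env;

  function new(string name, uvm_component parent);
    super.new(name, parent);
  endfunction

  function void build_phase(uvm_phase phase);
    super.build_phase(phase);
    env = my_env::type_id::create("env", this);  // factory creation
  endfunction

  task run_phase(uvm_phase phase);
    phase.raise_objection(this);
    `uvm_info("TEST", "Running the minimal UVM test", UVM_LOW)
    #100;                                        // placeholder for real stimulus
    phase.drop_objection(this);
  endtask
endclass

module tb_uvm_top;
  initial run_test("my_test");                   // selects the test by name
endmodule
```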
• SystemVerilog Language: Syntax, RTL, OOP, Class, Interface
• Verification Concepts: Constrained Random, Coverage Driven, Transaction Level, Sequences, Scoreboards
• Methodology: Base Classes, Use Cases, Configuration-db, Phases
 SystemVerilog language syntax & semantics are a pre-requisite
 All SystemVerilog experience is directly relevant for UVM (design/RTL, AVM, VMM, etc.)
 But be aware that the verification part of the language is much bigger than the part used for design!
▪ Design: RTL, blocks, modules, vectors, assignments, arrays, etc.
▪ Verification: signals, interfaces, clocking blocks, scheduling, functions, tasks, OOP, classes, randomization, constraints, coverage, queues, etc.
 All verification experience is directly transferrable to UVM
 Modularity and Reusability – The methodology is designed as modular components (driver, sequencer, agents, env, etc.) to enable re-use at different levels of verification and across projects
 Separating Tests from Testbenches – Tests, in terms of stimulus/sequences, are kept separate from the actual testbench hierarchy, so stimulus can be reused across different units or across projects
 Simulator independent – The base class library and the methodology are supported by all major simulators, so there is no dependence on any specific simulator
 Sequence-based stimulus generation – Sequences can be developed in several ways, including randomization, layered sequences, virtual sequences, etc., which provides good control and rich stimulus generation capability
 Configuration mechanisms simplify configuration of objects with deep hierarchy – The configuration mechanism (using the UVM config database) helps in easily configuring different testbench components based upon the verification environment using them, without worrying about how deep any component is in the testbench hierarchy
 Factory mechanism simplifies modification of components – Creating each component via the factory enables it to be overridden in different tests or environments without changing the underlying code base
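A hedged sketch of the last two points: uvm_config_db lets a test configure a component without knowing how deep it sits in the hierarchy, and a factory override swaps in a different component type without touching the environment code. All class names here are hypothetical.

```systemverilog
import uvm_pkg::*;
`include "uvm_macros.svh"

class bus_driver extends uvm_component;
  `uvm_component_utils(bus_driver)
  int n_items;
  function new(string name, uvm_component parent); super.new(name, parent); endfunction
  function void build_phase(uvm_phase phase);
    super.build_phase(phase);
    // Retrieve configuration without knowing how deep this driver sits.
    if (!uvm_config_db#(int)::get(this, "", "n_items", n_items))
      n_items = 10;                                  // default if not configured
  endfunction
  task run_phase(uvm_phase phase);
    `uvm_info("DRV", $sformatf("%s will drive %0d items", get_type_name(), n_items), UVM_LOW)
  endtask
endclass

// Error-injecting variant that can replace bus_driver through the factory.
class err_driver extends bus_driver;
  `uvm_component_utils(err_driver)
  function new(string name, uvm_component parent); super.new(name, parent); endfunction
endclass

class bus_env extends uvm_env;
  `uvm_component_utils(bus_env)
  bus_driver drv;
  function new(string name, uvm_component parent); super.new(name, parent); endfunction
  function void build_phase(uvm_phase phase);
    super.build_phase(phase);
    drv = bus_driver::type_id::create("drv", this);  // created via the factory
  endfunction
endclass

class cfg_test extends uvm_test;
  `uvm_component_utils(cfg_test)
  bus_env env;
  function new(string name, uvm_component parent); super.new(name, parent); endfunction
  function void build_phase(uvm_phase phase);
    super.build_phase(phase);
    // Configuration reaches anything under "env", regardless of its depth.
    uvm_config_db#(int)::set(this, "env*", "n_items", 200);
    // Factory override: every bus_driver created below becomes an err_driver.
    bus_driver::type_id::set_type_override(err_driver::get_type());
    env = bus_env::type_id::create("env", this);
  endfunction
endclass

module tb_config_factory_demo;
  initial run_test("cfg_test");
endmodule
```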
 Steep learning curve: For anyone new to the methodology, the learning curve to understand all the details and the library is very steep.
 Still developing and not perfect/stable: The methodology is still developing and carries overhead that can sometimes make simulation appear slow, and it may still contain some bugs
RVM: Reference Verification Methodology
eRM: e Reuse Methodology
URM: Universal Reuse Methodology
AVM: Advanced Verification Methodology
VMM: Verification Methodology Manual
OVM: Open Verification Methodology
 Run the most important tests first when you get a new build
 Do not start over on your test pass every time you receive a new
build
 Regression tests that have already been run many times are unlikely to reveal new bugs. If your testcases are fully automated, by all means run all of them for each build.
 Prioritize tests into a “Must-Pass” set, a more focused list of tests that reduces regression time. Major builds will warrant running all testcases.
 Automate whenever it makes sense to do so.
 Automation is a means of reducing manual effort in running
repetitive tasks such as regressions.
 Automation can also be done in creating test-benches so that a standard infrastructure is maintained across the team.
 This can be done using PERL scripts.
 Why use PERL?
- Free and works with most UNIX and Linux versions
- Easy to work with, smaller learning curve
- Advanced PERL with OOP support makes scripting easier
 Scripting
▪ PERL, Python, C++
 Languages and Methodologies
▪ Verilog, VHDL, SystemVerilog, UVM
 Problem Solving and debugging Skills
 Diligent and Methodical
 Documentation Skills
 Reading Skills!
 Be up to date on standards and adjacent technologies
 Don’t be a generalist… Be a specialist!
 Assess yourself : http://www.slideshare.net/RamdasMozhikunnath/exercises-on-advances-
in-verification-methodologies
50 Shades of Life
50 Colours of Love
 Belakoo Education Trust offers free quality education for underprivileged children. We run STEAM programs for Government School kids, supplementing their learning at school.
 We participate in Skill Development Programs for students under various running central/state-level schemes
https://www.facebook.com/belakootrust/
All pictures are from flickr.com, either copyrighted or under Creative Commons licenses
Visit my slideshare to view all
these presentations
Dr. Shivananda (Shivoo) R Koteshwar
Group Director, Synopsys
LINKEDIN: https://in.linkedin.com/in/shivoo2life
shivoo.koteshwar@gmail.com/ Facebook: shivoo.koteshwar
SLIDESHARE: www.slideshare.net/shivoo.koteshwar

More Related Content

What's hot

System verilog coverage
System verilog coverageSystem verilog coverage
System verilog coveragePushpa Yakkala
 
Verification challenges and methodologies - SoC and ASICs
Verification challenges and methodologies - SoC and ASICsVerification challenges and methodologies - SoC and ASICs
Verification challenges and methodologies - SoC and ASICsDr. Shivananda Koteshwar
 
System verilog control flow
System verilog control flowSystem verilog control flow
System verilog control flowPushpa Yakkala
 
SystemVerilog based OVM and UVM Verification Methodologies
SystemVerilog based OVM and UVM Verification MethodologiesSystemVerilog based OVM and UVM Verification Methodologies
SystemVerilog based OVM and UVM Verification MethodologiesRamdas Mozhikunnath
 
System Verilog Functional Coverage
System Verilog Functional CoverageSystem Verilog Functional Coverage
System Verilog Functional Coveragerraimi
 
System verilog important
System verilog importantSystem verilog important
System verilog importantelumalai7
 
Verification Engineer - Opportunities and Career Path
Verification Engineer - Opportunities and Career PathVerification Engineer - Opportunities and Career Path
Verification Engineer - Opportunities and Career PathRamdas Mozhikunnath
 
Basics of Functional Verification - Arrow Devices
Basics of Functional Verification - Arrow DevicesBasics of Functional Verification - Arrow Devices
Basics of Functional Verification - Arrow DevicesArrow Devices
 
Advances in Verification - Workshop at BMS College of Engineering
Advances in Verification - Workshop at BMS College of EngineeringAdvances in Verification - Workshop at BMS College of Engineering
Advances in Verification - Workshop at BMS College of EngineeringRamdas Mozhikunnath
 
System verilog assertions (sva) ( pdf drive )
System verilog assertions (sva) ( pdf drive )System verilog assertions (sva) ( pdf drive )
System verilog assertions (sva) ( pdf drive )sivasubramanian manickam
 
2019 2 testing and verification of vlsi design_verification
2019 2 testing and verification of vlsi design_verification2019 2 testing and verification of vlsi design_verification
2019 2 testing and verification of vlsi design_verificationUsha Mehta
 
The Verification Methodology Landscape
The Verification Methodology LandscapeThe Verification Methodology Landscape
The Verification Methodology LandscapeDVClub
 
Cracking Digital VLSI Verification Interview: Interview Success
Cracking Digital VLSI Verification Interview: Interview SuccessCracking Digital VLSI Verification Interview: Interview Success
Cracking Digital VLSI Verification Interview: Interview SuccessRamdas Mozhikunnath
 
verification_planning_systemverilog_uvm_2020
verification_planning_systemverilog_uvm_2020verification_planning_systemverilog_uvm_2020
verification_planning_systemverilog_uvm_2020Sameh El-Ashry
 
Efficient Methodology of Sampling UVM RAL During Simulation for SoC Functiona...
Efficient Methodology of Sampling UVM RAL During Simulation for SoC Functiona...Efficient Methodology of Sampling UVM RAL During Simulation for SoC Functiona...
Efficient Methodology of Sampling UVM RAL During Simulation for SoC Functiona...Sameh El-Ashry
 

What's hot (20)

System verilog coverage
System verilog coverageSystem verilog coverage
System verilog coverage
 
system verilog
system verilogsystem verilog
system verilog
 
Verification challenges and methodologies - SoC and ASICs
Verification challenges and methodologies - SoC and ASICsVerification challenges and methodologies - SoC and ASICs
Verification challenges and methodologies - SoC and ASICs
 
System verilog control flow
System verilog control flowSystem verilog control flow
System verilog control flow
 
SystemVerilog based OVM and UVM Verification Methodologies
SystemVerilog based OVM and UVM Verification MethodologiesSystemVerilog based OVM and UVM Verification Methodologies
SystemVerilog based OVM and UVM Verification Methodologies
 
System Verilog Functional Coverage
System Verilog Functional CoverageSystem Verilog Functional Coverage
System Verilog Functional Coverage
 
System verilog important
System verilog importantSystem verilog important
System verilog important
 
Uvm dac2011 final_color
Uvm dac2011 final_colorUvm dac2011 final_color
Uvm dac2011 final_color
 
UVM TUTORIAL;
UVM TUTORIAL;UVM TUTORIAL;
UVM TUTORIAL;
 
Verification Engineer - Opportunities and Career Path
Verification Engineer - Opportunities and Career PathVerification Engineer - Opportunities and Career Path
Verification Engineer - Opportunities and Career Path
 
ASIC design verification
ASIC design verificationASIC design verification
ASIC design verification
 
Basics of Functional Verification - Arrow Devices
Basics of Functional Verification - Arrow DevicesBasics of Functional Verification - Arrow Devices
Basics of Functional Verification - Arrow Devices
 
Advances in Verification - Workshop at BMS College of Engineering
Advances in Verification - Workshop at BMS College of EngineeringAdvances in Verification - Workshop at BMS College of Engineering
Advances in Verification - Workshop at BMS College of Engineering
 
System verilog assertions (sva) ( pdf drive )
System verilog assertions (sva) ( pdf drive )System verilog assertions (sva) ( pdf drive )
System verilog assertions (sva) ( pdf drive )
 
2019 2 testing and verification of vlsi design_verification
2019 2 testing and verification of vlsi design_verification2019 2 testing and verification of vlsi design_verification
2019 2 testing and verification of vlsi design_verification
 
Formal verification
Formal verificationFormal verification
Formal verification
 
The Verification Methodology Landscape
The Verification Methodology LandscapeThe Verification Methodology Landscape
The Verification Methodology Landscape
 
Cracking Digital VLSI Verification Interview: Interview Success
Cracking Digital VLSI Verification Interview: Interview SuccessCracking Digital VLSI Verification Interview: Interview Success
Cracking Digital VLSI Verification Interview: Interview Success
 
verification_planning_systemverilog_uvm_2020
verification_planning_systemverilog_uvm_2020verification_planning_systemverilog_uvm_2020
verification_planning_systemverilog_uvm_2020
 
Efficient Methodology of Sampling UVM RAL During Simulation for SoC Functiona...
Efficient Methodology of Sampling UVM RAL During Simulation for SoC Functiona...Efficient Methodology of Sampling UVM RAL During Simulation for SoC Functiona...
Efficient Methodology of Sampling UVM RAL During Simulation for SoC Functiona...
 

Similar to ASIC SoC Verification Challenges and Methodologies

Testing of Object-Oriented Software
Testing of Object-Oriented SoftwareTesting of Object-Oriented Software
Testing of Object-Oriented SoftwarePraveen Penumathsa
 
Dream QA: Designing the QA team where we'd love to work
Dream QA: Designing the QA team where we'd love to workDream QA: Designing the QA team where we'd love to work
Dream QA: Designing the QA team where we'd love to workManuel de la Peña Peña
 
Making Model-Driven Verification Practical and Scalable: Experiences and Less...
Making Model-Driven Verification Practical and Scalable: Experiences and Less...Making Model-Driven Verification Practical and Scalable: Experiences and Less...
Making Model-Driven Verification Practical and Scalable: Experiences and Less...Lionel Briand
 
Model Driven Testing: requirements, models & test
Model Driven Testing: requirements, models & test Model Driven Testing: requirements, models & test
Model Driven Testing: requirements, models & test Gregory Solovey
 
modelling-and-simulation-made-easy-with-simulink.pdf
modelling-and-simulation-made-easy-with-simulink.pdfmodelling-and-simulation-made-easy-with-simulink.pdf
modelling-and-simulation-made-easy-with-simulink.pdfGBBarrios
 
Discrete event simulation
Discrete event simulationDiscrete event simulation
Discrete event simulationssusera970cc
 
Introduction to networks simulation
Introduction to networks simulationIntroduction to networks simulation
Introduction to networks simulationahmed L. Khalaf
 
Generating test cases using UML Communication Diagram
Generating test cases using UML Communication Diagram Generating test cases using UML Communication Diagram
Generating test cases using UML Communication Diagram Praveen Penumathsa
 
Modeling & simulation in projects
Modeling & simulation in projectsModeling & simulation in projects
Modeling & simulation in projectsanki009
 
Modeling&Simulation_Ch01_part 3.pptx
Modeling&Simulation_Ch01_part 3.pptxModeling&Simulation_Ch01_part 3.pptx
Modeling&Simulation_Ch01_part 3.pptxMaiGaafar
 
Leave Product Development to the Dummies
Leave Product Development to the DummiesLeave Product Development to the Dummies
Leave Product Development to the Dummiesdavidcolls
 
DSD-INT 2014 - OpenMI Symposium - Federated modelling of Critical Infrastruct...
DSD-INT 2014 - OpenMI Symposium - Federated modelling of Critical Infrastruct...DSD-INT 2014 - OpenMI Symposium - Federated modelling of Critical Infrastruct...
DSD-INT 2014 - OpenMI Symposium - Federated modelling of Critical Infrastruct...Deltares
 
Pitfalls of machine learning in production
Pitfalls of machine learning in productionPitfalls of machine learning in production
Pitfalls of machine learning in productionAntoine Sauray
 
Darius Silingas - From Model-Driven Testing - EuroSTAR 2010
Darius Silingas - From Model-Driven Testing - EuroSTAR 2010Darius Silingas - From Model-Driven Testing - EuroSTAR 2010
Darius Silingas - From Model-Driven Testing - EuroSTAR 2010TEST Huddle
 
How to test a Mainframe Application
How to test a Mainframe ApplicationHow to test a Mainframe Application
How to test a Mainframe ApplicationMichael Erichsen
 

Similar to ASIC SoC Verification Challenges and Methodologies (20)

Report simulation
Report simulationReport simulation
Report simulation
 
Dill may-2008
Dill may-2008Dill may-2008
Dill may-2008
 
Testing of Object-Oriented Software
Testing of Object-Oriented SoftwareTesting of Object-Oriented Software
Testing of Object-Oriented Software
 
Dream QA: Designing the QA team where we'd love to work
Dream QA: Designing the QA team where we'd love to workDream QA: Designing the QA team where we'd love to work
Dream QA: Designing the QA team where we'd love to work
 
Making Model-Driven Verification Practical and Scalable: Experiences and Less...
Making Model-Driven Verification Practical and Scalable: Experiences and Less...Making Model-Driven Verification Practical and Scalable: Experiences and Less...
Making Model-Driven Verification Practical and Scalable: Experiences and Less...
 
Model Driven Testing: requirements, models & test
Model Driven Testing: requirements, models & test Model Driven Testing: requirements, models & test
Model Driven Testing: requirements, models & test
 
modelling-and-simulation-made-easy-with-simulink.pdf
modelling-and-simulation-made-easy-with-simulink.pdfmodelling-and-simulation-made-easy-with-simulink.pdf
modelling-and-simulation-made-easy-with-simulink.pdf
 
Discrete event simulation
Discrete event simulationDiscrete event simulation
Discrete event simulation
 
Robotics
RoboticsRobotics
Robotics
 
Introduction to networks simulation
Introduction to networks simulationIntroduction to networks simulation
Introduction to networks simulation
 
Generating test cases using UML Communication Diagram
Generating test cases using UML Communication Diagram Generating test cases using UML Communication Diagram
Generating test cases using UML Communication Diagram
 
Modeling & simulation in projects
Modeling & simulation in projectsModeling & simulation in projects
Modeling & simulation in projects
 
Modeling&Simulation_Ch01_part 3.pptx
Modeling&Simulation_Ch01_part 3.pptxModeling&Simulation_Ch01_part 3.pptx
Modeling&Simulation_Ch01_part 3.pptx
 
Leave Product Development to the Dummies
Leave Product Development to the DummiesLeave Product Development to the Dummies
Leave Product Development to the Dummies
 
DSD-INT 2014 - OpenMI Symposium - Federated modelling of Critical Infrastruct...
DSD-INT 2014 - OpenMI Symposium - Federated modelling of Critical Infrastruct...DSD-INT 2014 - OpenMI Symposium - Federated modelling of Critical Infrastruct...
DSD-INT 2014 - OpenMI Symposium - Federated modelling of Critical Infrastruct...
 
Pragmatic Code Coverage
Pragmatic Code CoveragePragmatic Code Coverage
Pragmatic Code Coverage
 
Design Verification
Design VerificationDesign Verification
Design Verification
 
Pitfalls of machine learning in production
Pitfalls of machine learning in productionPitfalls of machine learning in production
Pitfalls of machine learning in production
 
Darius Silingas - From Model-Driven Testing - EuroSTAR 2010
Darius Silingas - From Model-Driven Testing - EuroSTAR 2010Darius Silingas - From Model-Driven Testing - EuroSTAR 2010
Darius Silingas - From Model-Driven Testing - EuroSTAR 2010
 
How to test a Mainframe Application
How to test a Mainframe ApplicationHow to test a Mainframe Application
How to test a Mainframe Application
 

More from Dr. Shivananda Koteshwar

Role of a manager in cultural transformation
Role of a manager in cultural transformationRole of a manager in cultural transformation
Role of a manager in cultural transformationDr. Shivananda Koteshwar
 
Innovation in GCC - Global Capability Center
Innovation in GCC - Global Capability CenterInnovation in GCC - Global Capability Center
Innovation in GCC - Global Capability CenterDr. Shivananda Koteshwar
 
Introduction to consultancy for MBA Freshers
Introduction to consultancy for MBA FreshersIntroduction to consultancy for MBA Freshers
Introduction to consultancy for MBA FreshersDr. Shivananda Koteshwar
 
Understanding scale Clean tech and Agritech verticals
Understanding scale   Clean tech and Agritech verticalsUnderstanding scale   Clean tech and Agritech verticals
Understanding scale Clean tech and Agritech verticalsDr. Shivananda Koteshwar
 
IoT product business plan creation for entrepreneurs and intrepreneurs
IoT product business plan creation for entrepreneurs and intrepreneursIoT product business plan creation for entrepreneurs and intrepreneurs
IoT product business plan creation for entrepreneurs and intrepreneursDr. Shivananda Koteshwar
 

More from Dr. Shivananda Koteshwar (20)

Aurinko Open Day (11th and 12th)
Aurinko Open Day (11th and 12th)Aurinko Open Day (11th and 12th)
Aurinko Open Day (11th and 12th)
 
Aurinko Open Day (Pre KG to 10th Grade)
Aurinko Open Day (Pre KG to 10th Grade)Aurinko Open Day (Pre KG to 10th Grade)
Aurinko Open Day (Pre KG to 10th Grade)
 
BELAKUBE METHODOLOGY
BELAKUBE METHODOLOGYBELAKUBE METHODOLOGY
BELAKUBE METHODOLOGY
 
Belakoo Annual Report 2021-22
Belakoo Annual Report 2021-22Belakoo Annual Report 2021-22
Belakoo Annual Report 2021-22
 
Role of a manager in cultural transformation
Role of a manager in cultural transformationRole of a manager in cultural transformation
Role of a manager in cultural transformation
 
Social Entrepreneurship
Social EntrepreneurshipSocial Entrepreneurship
Social Entrepreneurship
 
Innovation in GCC - Global Capability Center
Innovation in GCC - Global Capability CenterInnovation in GCC - Global Capability Center
Innovation in GCC - Global Capability Center
 
Corporate Expectation from a MBA Graduate
Corporate Expectation from a MBA GraduateCorporate Expectation from a MBA Graduate
Corporate Expectation from a MBA Graduate
 
Introduction to consultancy for MBA Freshers
Introduction to consultancy for MBA FreshersIntroduction to consultancy for MBA Freshers
Introduction to consultancy for MBA Freshers
 
Bachelor of Design (BDes)
Bachelor of Design (BDes)Bachelor of Design (BDes)
Bachelor of Design (BDes)
 
Understanding scale Clean tech and Agritech verticals
Understanding scale   Clean tech and Agritech verticalsUnderstanding scale   Clean tech and Agritech verticals
Understanding scale Clean tech and Agritech verticals
 
Evolution and Advancement in Chipsets
Evolution and Advancement in ChipsetsEvolution and Advancement in Chipsets
Evolution and Advancement in Chipsets
 
Ideation and validation - An exercise
Ideation and validation -  An exerciseIdeation and validation -  An exercise
Ideation and validation - An exercise
 
IoT product business plan creation for entrepreneurs and intrepreneurs
IoT product business plan creation for entrepreneurs and intrepreneursIoT product business plan creation for entrepreneurs and intrepreneurs
IoT product business plan creation for entrepreneurs and intrepreneurs
 
IoT Product Design and Prototyping
IoT Product Design and PrototypingIoT Product Design and Prototyping
IoT Product Design and Prototyping
 
Business model
Business modelBusiness model
Business model
 
Engaging Today's kids
Engaging Today's kidsEngaging Today's kids
Engaging Today's kids
 
Nurturing Innovative Minds
Nurturing Innovative MindsNurturing Innovative Minds
Nurturing Innovative Minds
 
Creating those dots
Creating those dotsCreating those dots
Creating those dots
 
Future of Work - Impact on HR
Future of Work - Impact on HRFuture of Work - Impact on HR
Future of Work - Impact on HR
 

Recently uploaded

Types of Journalistic Writing Grade 8.pptx
Types of Journalistic Writing Grade 8.pptxTypes of Journalistic Writing Grade 8.pptx
Types of Journalistic Writing Grade 8.pptxEyham Joco
 
AmericanHighSchoolsprezentacijaoskolama.
AmericanHighSchoolsprezentacijaoskolama.AmericanHighSchoolsprezentacijaoskolama.
AmericanHighSchoolsprezentacijaoskolama.arsicmarija21
 
Like-prefer-love -hate+verb+ing & silent letters & citizenship text.pdf
Like-prefer-love -hate+verb+ing & silent letters & citizenship text.pdfLike-prefer-love -hate+verb+ing & silent letters & citizenship text.pdf
Like-prefer-love -hate+verb+ing & silent letters & citizenship text.pdfMr Bounab Samir
 
ROOT CAUSE ANALYSIS PowerPoint Presentation
ROOT CAUSE ANALYSIS PowerPoint PresentationROOT CAUSE ANALYSIS PowerPoint Presentation
ROOT CAUSE ANALYSIS PowerPoint PresentationAadityaSharma884161
 
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️9953056974 Low Rate Call Girls In Saket, Delhi NCR
 
How to do quick user assign in kanban in Odoo 17 ERP
How to do quick user assign in kanban in Odoo 17 ERPHow to do quick user assign in kanban in Odoo 17 ERP
How to do quick user assign in kanban in Odoo 17 ERPCeline George
 
Influencing policy (training slides from Fast Track Impact)
Influencing policy (training slides from Fast Track Impact)Influencing policy (training slides from Fast Track Impact)
Influencing policy (training slides from Fast Track Impact)Mark Reed
 
Romantic Opera MUSIC FOR GRADE NINE pptx
Romantic Opera MUSIC FOR GRADE NINE pptxRomantic Opera MUSIC FOR GRADE NINE pptx
Romantic Opera MUSIC FOR GRADE NINE pptxsqpmdrvczh
 
ACC 2024 Chronicles. Cardiology. Exam.pdf
ACC 2024 Chronicles. Cardiology. Exam.pdfACC 2024 Chronicles. Cardiology. Exam.pdf
ACC 2024 Chronicles. Cardiology. Exam.pdfSpandanaRallapalli
 
Judging the Relevance and worth of ideas part 2.pptx
Judging the Relevance  and worth of ideas part 2.pptxJudging the Relevance  and worth of ideas part 2.pptx
Judging the Relevance and worth of ideas part 2.pptxSherlyMaeNeri
 
Keynote by Prof. Wurzer at Nordex about IP-design
Keynote by Prof. Wurzer at Nordex about IP-designKeynote by Prof. Wurzer at Nordex about IP-design
Keynote by Prof. Wurzer at Nordex about IP-designMIPLM
 
Introduction to AI in Higher Education_draft.pptx
Introduction to AI in Higher Education_draft.pptxIntroduction to AI in Higher Education_draft.pptx
Introduction to AI in Higher Education_draft.pptxpboyjonauth
 
Quarter 4 Peace-education.pptx Catch Up Friday
Quarter 4 Peace-education.pptx Catch Up FridayQuarter 4 Peace-education.pptx Catch Up Friday
Quarter 4 Peace-education.pptx Catch Up FridayMakMakNepo
 
Introduction to ArtificiaI Intelligence in Higher Education
Introduction to ArtificiaI Intelligence in Higher EducationIntroduction to ArtificiaI Intelligence in Higher Education
Introduction to ArtificiaI Intelligence in Higher Educationpboyjonauth
 
Computed Fields and api Depends in the Odoo 17
Computed Fields and api Depends in the Odoo 17Computed Fields and api Depends in the Odoo 17
Computed Fields and api Depends in the Odoo 17Celine George
 
Proudly South Africa powerpoint Thorisha.pptx
Proudly South Africa powerpoint Thorisha.pptxProudly South Africa powerpoint Thorisha.pptx
Proudly South Africa powerpoint Thorisha.pptxthorishapillay1
 
Procuring digital preservation CAN be quick and painless with our new dynamic...
Procuring digital preservation CAN be quick and painless with our new dynamic...Procuring digital preservation CAN be quick and painless with our new dynamic...
Procuring digital preservation CAN be quick and painless with our new dynamic...Jisc
 
Hierarchy of management that covers different levels of management
Hierarchy of management that covers different levels of managementHierarchy of management that covers different levels of management
Hierarchy of management that covers different levels of managementmkooblal
 

Recently uploaded (20)

Types of Journalistic Writing Grade 8.pptx
Types of Journalistic Writing Grade 8.pptxTypes of Journalistic Writing Grade 8.pptx
Types of Journalistic Writing Grade 8.pptx
 
AmericanHighSchoolsprezentacijaoskolama.
AmericanHighSchoolsprezentacijaoskolama.AmericanHighSchoolsprezentacijaoskolama.
AmericanHighSchoolsprezentacijaoskolama.
 
Like-prefer-love -hate+verb+ing & silent letters & citizenship text.pdf
Like-prefer-love -hate+verb+ing & silent letters & citizenship text.pdfLike-prefer-love -hate+verb+ing & silent letters & citizenship text.pdf
Like-prefer-love -hate+verb+ing & silent letters & citizenship text.pdf
 
ROOT CAUSE ANALYSIS PowerPoint Presentation
ROOT CAUSE ANALYSIS PowerPoint PresentationROOT CAUSE ANALYSIS PowerPoint Presentation
ROOT CAUSE ANALYSIS PowerPoint Presentation
 
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️
 
How to do quick user assign in kanban in Odoo 17 ERP
How to do quick user assign in kanban in Odoo 17 ERPHow to do quick user assign in kanban in Odoo 17 ERP
How to do quick user assign in kanban in Odoo 17 ERP
 
Influencing policy (training slides from Fast Track Impact)
Influencing policy (training slides from Fast Track Impact)Influencing policy (training slides from Fast Track Impact)
Influencing policy (training slides from Fast Track Impact)
 
Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝
Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝
Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝
 
Romantic Opera MUSIC FOR GRADE NINE pptx
Romantic Opera MUSIC FOR GRADE NINE pptxRomantic Opera MUSIC FOR GRADE NINE pptx
Romantic Opera MUSIC FOR GRADE NINE pptx
 
9953330565 Low Rate Call Girls In Rohini Delhi NCR
9953330565 Low Rate Call Girls In Rohini  Delhi NCR9953330565 Low Rate Call Girls In Rohini  Delhi NCR
9953330565 Low Rate Call Girls In Rohini Delhi NCR
 
ACC 2024 Chronicles. Cardiology. Exam.pdf
ACC 2024 Chronicles. Cardiology. Exam.pdfACC 2024 Chronicles. Cardiology. Exam.pdf
ACC 2024 Chronicles. Cardiology. Exam.pdf
 
Judging the Relevance and worth of ideas part 2.pptx
Judging the Relevance  and worth of ideas part 2.pptxJudging the Relevance  and worth of ideas part 2.pptx
Judging the Relevance and worth of ideas part 2.pptx
 
Keynote by Prof. Wurzer at Nordex about IP-design
Keynote by Prof. Wurzer at Nordex about IP-designKeynote by Prof. Wurzer at Nordex about IP-design
Keynote by Prof. Wurzer at Nordex about IP-design
 
Introduction to AI in Higher Education_draft.pptx
Introduction to AI in Higher Education_draft.pptxIntroduction to AI in Higher Education_draft.pptx
Introduction to AI in Higher Education_draft.pptx
 
Quarter 4 Peace-education.pptx Catch Up Friday
Quarter 4 Peace-education.pptx Catch Up FridayQuarter 4 Peace-education.pptx Catch Up Friday
Quarter 4 Peace-education.pptx Catch Up Friday
 
Introduction to ArtificiaI Intelligence in Higher Education
Introduction to ArtificiaI Intelligence in Higher EducationIntroduction to ArtificiaI Intelligence in Higher Education
Introduction to ArtificiaI Intelligence in Higher Education
 
Computed Fields and api Depends in the Odoo 17
Computed Fields and api Depends in the Odoo 17Computed Fields and api Depends in the Odoo 17
Computed Fields and api Depends in the Odoo 17
 
Proudly South Africa powerpoint Thorisha.pptx
Proudly South Africa powerpoint Thorisha.pptxProudly South Africa powerpoint Thorisha.pptx
Proudly South Africa powerpoint Thorisha.pptx
 
Procuring digital preservation CAN be quick and painless with our new dynamic...
Procuring digital preservation CAN be quick and painless with our new dynamic...Procuring digital preservation CAN be quick and painless with our new dynamic...
Procuring digital preservation CAN be quick and painless with our new dynamic...
 
Hierarchy of management that covers different levels of management
Hierarchy of management that covers different levels of managementHierarchy of management that covers different levels of management
Hierarchy of management that covers different levels of management
 

ASIC SoC Verification Challenges and Methodologies

  • 1. Dr. Shivananda (Shivoo) R Koteshwar Design Group Digital Implementation Site Leader and R&D Head Synopsys India LINKEDIN : https://in.linkedin.com/in/shivoo2life FACEBOOK : shivoo.koteshwar SLIDESHARE : www.slideshare.net/shivoo.koteshwar Dayananda College, Bengaluru March 2019
  • 2. 1. Basics 2. Verification Challenges 3. VerificationTechnologies 4. Verification Strategies 5. Verification Methodologies 6. Skills needed for today’s corporate job 7. Q&A
  • 3. • Design synthesis: ▪ Given an I/O function, develop a procedure to manufacture a device using known materials and processes • Verification: ▪ Predictive analysis to ensure that the synthesized design, when manufactured, will perform the given I/O function • Test: ▪ A manufacturing step that ensures that the physical device, manufactured from the synthesized design, has no manufacturing defect. 3
  • 4. 4
  • 5.  Goal:Validate a model of the design  Testbench wraps around the design under test (DUT)  Inputs provide (deterministic or random) stimulus ▪ Reference signals: clock(s), reset, etc. ▪ Data: bits, bit words ▪ Protocols: PCI, SPI,AMBA, USB, etc.  Outputs capture responses and make checks ▪ Data: bits, bit words ▪ Protocols: PCI, SPI,AMBA, USB, etc. Design Under Test Testbench inputs outputs
  • 6. Verification is the process of verifying the transformation steps in the design flow are executed correctly. Algorithm Architecture/ Spec RTL Gate GDSII ASIC End productIdea Product Acceptance Test Transformations C-Model Spec Acceptance Review Simulation/ Code Review Formal Functional/ Timing Verification ATE Sign-Off Review
  • 7.  Ensure full conformance with specification: ▪ Must avoid false positives (untested functionalities) ???Pass Fail Good Bad(bug) RTL code Tape out! Debug testbench Debug RTL code Testbench Simulation result False positive results in shipping a bad design How do we achieve this goal?
  • 8.
  • 9.  Simulators are the most common and familiar verification tools. They are named simulators because their role is limited to approximating reality.  A simulation is never the final goal of a project. The goal of all hardware design projects is to create real physical designs that can be sold and generate profits.  Simulators attempt to create an artificial universe that mimics the future real design. This lets the designers interact with the design before it is manufactured and correct flaws and problems earlier  Simulators are only approximations of reality ▪ Many physical characteristics are simplified - or even ignored - to ease the simulation task. For example, a digital simulator assumes that the only possible values for a signal are ‘0’, ‘1’, X, and Z. However, in the physical and analog world, the value of a signal is a continuous: an infinite number of possible values. In a discrete simulator, events that happen deterministically 5 ns apart may be asynchronous in the real world and may occur randomly  Simulators are at the mercy of the descriptions being simulated ▪ The description is limited to a well-defined language with precise semantics. If that description does not accurately reflect the reality it is trying to model, there is no way for you to know that you are simulating something that is different from the design that will be ultimately manufactured. Functional correctness and accuracy of models is a big problem as errors cannot be proven not to exist. 9
  • 10.  Simulation requires stimulus ▪ Simulators are not static tools. A static verification tool performs its task on the design without any additional information or action required by the user. For example, linting tools are static tools. Simulators, on the other hand, require that you provide a facsimile of the environment in which the design will find itself. This facsimile is often called a testbench, stimulus. ▪ The testbench needs to provide a representation of the inputs observed by the design, so the simulator can emulate the design’s responses based on its description  The simulation outputs are validated externally, against design intents. ▪ The other thing that you must not forget is that simulators have no knowledge of your intentions. They cannot determine if a design being simulated is correct. Correctness is a value judgment on the outcome of a simulation that must be made by you, the designer. ▪ Once the design is submitted to an approximation of the inputs from its environment, your primary responsibility is to examine the outputs produced by the simulation of the design’s description and determine if that response is appropriate. 10
  • 11.  Simulators are never fast enough ▪ They are attempting to emulate a physical world where electricity travels at the speed of light and transistors switch over one billion times in a second. Simulators are implemented using general purpose computers that can execute, under ideal conditions, up to 100 million instructions per second ▪ The speed advantage is unfairly and forever tipped in favor of the physical world  Outputs change only when an input changes ▪ One way to optimize the performance of a simulator is to avoid simulating something that does not need to be simulated. ▪ Figure shows a 2-input XOR gate. In the physical world, if the inputs do not change (a), even though voltage is constantly applied, output does not change Only if one of the inputs change (b) does the output change  Change in values, called events, drive the simulation process ▪ The simulator could choose to continuously execute this model, producing the same output value if the input values did not change. ▪ An opportunity to improve upon that simulator’s performance becomes obvious: do not execute the model while the inputs are constants. Phrased another way: only execute a model when an input changes. The simulation is therefore driven by changes in inputs. If you define an input change as an event, you now have an event-driven simulator 11
  • 12.  Cycle-based simulations have no timing information ▪ This great improvement in simulation performance comes at a cost: all timing and delay information is lost. Cycle-based simulators assume that the entire design meets the set-up and hold requirements of all the flip-flops. ▪ When using a cycle-based simulator, timing is usually verified using a static timing analyzer  Cycle-based simulators can only handle synchronous circuits ▪ Cycle-based simulators further assume that the active clock edge is the only significant event in changing the state of the design. All other inputs are assumed to be perfectly synchronous with the active clock edge.Therefore, cycle-based simulators can only simulate perfectly synchronous designs ▪ Anything containing asynchronous inputs, latches, or multiple-clock domains cannot be simulated accurately.,The same restrictions apply to static timing analysis.Thus, circuits that are suitable for cycle-based simulation to verify the functionality, are suitable for static timing verification to verify the timing 12
  • 13.  To handle the portions of a design that do not meet the requirements for cycle-based simulation, most simulators are integrated with an event-driven simulator  As shown, the synchronous portion of the design is simulated using the cycle-based algorithm, while the remainder of the design is simulated using a conventional event-driven simulator  Both simulators (event-driven and cycle-based) are running together, cooperating to simulate the entire design ◼ Other popular co-simulation environments provide VHDL and Verilog, HDL and C, or digital and analog co- simulation 13
  • 14. Design Errors Simulation –Practical Problem
  • 15.  Coverage ▪ Code Coverage ▪ Statement or Block Coverage ▪ Path Coverage ▪ Expression Coverage ▪ Functional Coverage  Verification languages can raise the level of abstraction  Best way to increase productivity is to raise the level of abstraction used to perform a task  VHDL andVerilog are simulation languages, not verification languages
  • 16.  VHDL simulation tools can automatically calculate a metric called code coverage (assuming you have licenses for this feature).  Code coverage tracks what lines of code or expressions in the code have been exercised.  Code coverage cannot detect conditions that are not in the code  Code coverage on a partially implemented design can reach 100%. It cannot detect missing features and many boundary conditions (in particular those that span more than one block)  Code coverage is an optimistic metric. In combinational logic code in an HDL, a process may be executed many times during a given clock cycle due to delta cycle changes on input signals. This can result in several different branches of code being executed. However, only the last branch of code executed before the clock edge truly has been covered.  Hence, code coverage cannot be used exclusively to indicate we are done testing. 16
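A small sketch of that blind spot (the decoder and the missing opcode are purely illustrative): two or three stimulus values reach 100% statement and branch coverage on this block, yet coverage can say nothing about a specified opcode that was never coded.

    module decoder (input logic [1:0] op, output logic [3:0] y);
      always_comb begin
        case (op)
          2'b00:   y = 4'b0001;
          2'b01:   y = 4'b0010;
          // A (hypothetically) specified opcode 2'b10 was never implemented;
          // there is no line here for a coverage tool to flag as uncovered
          default: y = 4'b0000;
        endcase
      end
    endmodule
    // Driving op = 2'b00, 2'b01 and any one other value executes every
    // written statement and branch, so the report shows 100% coverage.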
  • 17.  Functional coverage is code that observes execution of a test plan. As such, it is code you write to track whether important values, sets of values, or sequences of values that correspond to design or interface requirements, features, or boundary conditions have been exercised  Specifically, 100% functional coverage indicates that all items in the test plan have been tested. Combine this with 100% code coverage and it indicates that testing is done  Functional coverage that examines the values within a single object is called either point coverage or item coverage ▪ One relationship we might look at is different transfer sizes across a packet-based bus. For example, the test plan may require that transfers with the following size or range of sizes be observed: 1, 2, 3, 4 to 127, 128 to 252, 253, 254, or 255.  Functional coverage that examines the relationships between different objects is called cross coverage. An example of this would be examining whether an ALU has done all of its supported operations with every different input pair of registers  VHDL’s Open Source VHDL Verification Methodology (OSVVM) provides a package, CoveragePkg, with a protected type that facilitates capturing the data structure and writing functional coverage  Functional coverage provides additional supporting data that the design is tested. It is a supplement to primitive test methods: directed, algorithmic, file-based, or constrained random 17
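OSVVM's CoveragePkg is VHDL; as a sketch of the same two ideas in SystemVerilog (all names are illustrative), the covergroup below captures the point coverage on transfer size from the example above and a cross of ALU operation against the operand register selects.

    typedef enum {ADD, SUB, AND_OP, OR_OP} alu_op_e;

    class txn_coverage;
      rand bit [7:0] size;   // packet transfer size
      rand alu_op_e  op;     // ALU operation
      rand bit [2:0] src_a;  // operand register selects
      rand bit [2:0] src_b;

      covergroup cg;
        // Point (item) coverage: the sizes/ranges called out in the test plan
        cp_size : coverpoint size {
          bins one   = {1};
          bins two   = {2};
          bins three = {3};
          bins small = {[4:127]};
          bins large = {[128:252]};
          bins b253  = {253};
          bins b254  = {254};
          bins b255  = {255};
        }
        cp_op : coverpoint op;
        cp_a  : coverpoint src_a;
        cp_b  : coverpoint src_b;
        // Cross coverage: every operation with every register pair
        op_x_regs : cross cp_op, cp_a, cp_b;
      endgroup

      function new();
        cg = new();
      endfunction
    endclass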
  • 18.  Completeness does not imply correctness: ▪ Code coverage indicates how thoroughly your entire verification suite exercises the source code. It does not provide an indication, in any way, of the correctness of the verification suite ▪ Code coverage should be used to help identify corner cases that were not exercised by the verification suite or implementation-dependent features that were introduced during the implementation ▪ Code coverage is an additional indicator for the completeness of the verification job. It can help increase your confidence that the verification job is complete, but it should not be your only indicator.  Code coverage lets you know if you are not done: Code coverage indicates that the verification task is not complete through low coverage numbers. A high coverage number is by no means an indication that the job is over  Some tools can help you reach 100% coverage: There are testbench generation tools that automatically generate stimulus to exercise the uncovered code sections of a design  Code coverage tools can be used as profilers: When developing models for simulation only, where performance is an important criterion, code coverage tools can be used for profiling. The aim of profiling is the opposite of code coverage: to identify the lines of code that are executed most often. These lines of code become the primary candidates for performance optimization efforts 18
  • 19.  It is quite possible to achieve 100% code coverage but only 50% functional coverage ▪ Here the design is half complete  Equally, it is possible to have 50% code coverage but 100% functional coverage ▪ Indicates that the functional coverage model is missing some key features of the design ▪ Indicates the design contains untested code that is not part of the test plan ▪ This can come from an incomplete test plan, extra undocumented features in the design, or case statement default/others branches that do not get exercised in normal hardware operation ▪ Untested features need to either be tested or removed ▪ As a result, even with 100% functional coverage it is still a good idea to use code coverage as a fail-safe for the test plan  Code coverage is quantitative coverage and functional coverage is qualitative coverage  The two coverage approaches are complementary, and high-quality verification will benefit from both. 19 Test Done = Test Plan Executed and All Code Executed REF: https://www.doulos.com/knowhow/sysverilog/uvm/easier_uvm_guidelines/coverage-driven
  • 21.  IP / Module Level Verification flow:
1. Study the DUT and related specification
2. Gather requirements for the features to be verified and set priorities
3. Review requirements with the IP architect/designer (requirements should cover all parameters of the module)
4. Design the test infrastructure on paper / in a document (includes re-usable verification components)
5. Review the TB architecture with the verification team
6. Implement the test infrastructure (includes re-usable verification components)
7. Code testcases as per the test-bench plan; also code functional coverpoints / assertions
8. Complete verification such that functional coverage is 100% and code coverage numbers are logged
9. Review code coverage numbers with the designer to eliminate dead-code possibilities
10. Sign off module-level verification by checking in the files having relevant data such as logs
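The "functional coverpoints / assertions" step in this flow typically means SystemVerilog Assertions written alongside the covergroups; a minimal sketch with hypothetical handshake signals and a made-up timing requirement:

    module handshake_checks (input logic clk, rst_n, req, ack);
      // Assumed requirement (for illustration only): every request must
      // be acknowledged within 1 to 4 clock cycles
      property p_req_gets_ack;
        @(posedge clk) disable iff (!rst_n)
          req |-> ##[1:4] ack;
      endproperty

      a_req_gets_ack : assert property (p_req_gets_ack)
        else $error("req not acknowledged within 4 cycles");

      // A cover directive records that the scenario was actually exercised
      c_req_ack : cover property (@(posedge clk) req ##[1:4] ack);
    endmodule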
  • 22.  SoC Level Verification flow:
1. Study the SoC and related specification
2. Gather requirements for critical data paths and set priorities
3. Review requirements with the IP architect/designer (requirements should cover all parameters of the module)
4. Design the test infrastructure on paper / in a document; identify testcases that can be re-used (includes re-usable verification components)
5. Review the TB architecture with the verification team
6. Implement the test infrastructure (includes re-usable verification components)
7. Code testcases as per the test-bench plan; also code functional coverpoints / assertions
8. Complete verification such that functional coverage is 100% and code coverage numbers are logged
9. Review code coverage numbers with the designer to eliminate dead-code possibilities
10. Sign off SoC verification by checking in the files having relevant data such as logs
  • 24.  Accellera Systems Initiative is an independent, not-for-profit organization dedicated to create, support, promote, and advance system-level design, modeling, and verification standards for use by the worldwide electronics industry  www.accellera.org 24
  • 25.  Verification languages can raise the level of abstraction ▪ Best way to increase productivity is to raise the level of abstraction used to perform a task  VHDL and Verilog are simulation languages, not verification languages ▪ Verilog was designed with a focus on describing low-level hardware structures. It does not provide support for high-level data structures or object-oriented features ▪ VHDL was designed for very large design teams. It strongly encapsulates all information and communicates strictly through well-defined interfaces ▪ Very often, these limitations get in the way of an efficient implementation of a verification strategy. Neither integrates easily with C models ▪ This creates an opportunity for verification languages designed to overcome the shortcomings of Verilog and VHDL. However, using a verification language requires additional training and tool costs  Proprietary verification languages exist ▪ e/Specman from Verisity, VERA from Synopsys, Rave from Chronology, etc. 25
  • 26.  Provides a reusable, standard infrastructure in the form of pre-defined base classes. These can be extended and enhanced as per user needs  Defines rules to create behavioral models, also known as Verification Components (OVC/UVC)  Defines standards for higher-level modelling of input stimulus using Transaction Level Modelling (TLM)  Defines rules to have a layered structure of test-benches  In summary, Methodology = standardization of the way of creating complex test-benches with constrained random test-vectors.
  • 27.  OVM ▪ Open Verification Methodology ▪ Derived mainly from the URM (Universal Reuse Methodology) which was, to a large part, based on the eRM (e Reuse Methodology) for the e verification language developed by Verisity Design in 2001 ▪ The OVM also brings in concepts from the Advanced Verification Methodology (AVM) ▪ SystemVerilog  RVM ▪ Reference Verification Methodology ▪ Complete set of metrics and methods for performing functional verification of complex designs ▪ The SystemVerilog implementation of the RVM is known as the VMM (Verification Methodology Manual)  OVL ▪ Open Verification Library ▪ The OVL library of assertion checkers is intended to be used by design, integration, and verification engineers to check for good/bad behavior in simulation, emulation, and formal verification. ▪ Accellera - http://www.accellera.org/downloads/standards/ovl/  UVM ▪ Universal Verification Methodology (Accellera standard) ▪ Accellera - http://www.accellera.org/downloads/standards/uvm ▪ SystemVerilog  OS-VVM ▪ Open Source VHDL Verification Methodology ▪ VHDL ▪ Accellera OVC: OVM Verification Component UVC: Universal Verification Component
  • 28.  C-type data types like int, typedef, struct, union, enum  Dynamic data types: strings, classes, dynamic queues, dynamic arrays  New operators and built-in methods  Enhanced flow control: foreach, return, break, continue  Inter-process synchronization: semaphores, mailboxes, event extensions  Assertions and coverage  Clocking blocks (clocking domains)  Direct Programming Interface (DPI) / VPI  Hardware-specific procedures REF: http://www.eetimes.com/document.asp?doc_id=1277143
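A short sketch pulling a few of these features together (all names are illustrative): a class with rand members and a constraint, an enum, and a mailbox used for inter-process synchronization between a generator and a consumer.

    typedef enum {READ, WRITE} kind_e;

    class bus_txn;
      rand kind_e     kind;
      rand bit [31:0] addr;
      rand bit [31:0] data;
      // Constrained random: keep addresses in a hypothetical 4 KB window
      constraint c_addr { addr inside {[32'h0000_1000 : 32'h0000_1FFF]}; }
    endclass

    module tb;
      mailbox #(bus_txn) gen2drv = new();

      // Generator process: randomize transactions and hand them off
      initial begin
        repeat (10) begin
          bus_txn t = new();
          if (!t.randomize()) $error("randomization failed");
          gen2drv.put(t);
        end
      end

      // Consumer process: blocks on get() until a transaction arrives
      initial begin
        bus_txn t;
        repeat (10) begin
          gen2drv.get(t);
          $display("%s addr=%h data=%h", t.kind.name(), t.addr, t.data);
        end
      end
    endmodule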
  • 29.  UVM (Universal Verification Methodology) is a SystemVerilog-based verification methodology  UVM consists of a defined methodology for architecting modular testbenches for design verification.  UVM has a library of classes that helps in designing and implementing modular testbench components and stimulus. This enables re-using testbench components and stimulus within and across projects, development of Verification IP, easier migration from simulation to emulation, etc.  Relies on strong, proven industry foundations. The core of its success is adherence to a standard (i.e. architecture, stimulus creation, automation, factory usage standards, etc.) 29
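A minimal sketch of what extending the UVM base-class library looks like (the transaction and driver below are illustrative fragments, not a complete environment):

    import uvm_pkg::*;
    `include "uvm_macros.svh"

    // A trivial transaction derived from the UVM base class library
    class my_txn extends uvm_sequence_item;
      rand bit [7:0] data;
      `uvm_object_utils(my_txn)
      function new(string name = "my_txn");
        super.new(name);
      endfunction
    endclass

    // A reusable testbench component, registered with the UVM factory
    class my_driver extends uvm_driver #(my_txn);
      `uvm_component_utils(my_driver)
      function new(string name, uvm_component parent);
        super.new(name, parent);
      endfunction

      task run_phase(uvm_phase phase);
        forever begin
          seq_item_port.get_next_item(req);   // pull the next transaction
          `uvm_info("DRV", $sformatf("driving data=%0h", req.data), UVM_MEDIUM)
          // (pin wiggling on the DUT interface omitted)
          seq_item_port.item_done();
        end
      endtask
    endclass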
  • 30.  The following can be automated using UVM ▪ Coverage Driven Verification (CDV) environments ▪ Automated stimulus generation ▪ Independent checking ▪ Coverage collection 30
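Automated stimulus generation in UVM usually takes the form of a sequence that creates and randomizes items in a loop; a sketch (reusing the hypothetical my_txn from the earlier fragment):

    class my_rand_seq extends uvm_sequence #(my_txn);
      `uvm_object_utils(my_rand_seq)
      function new(string name = "my_rand_seq");
        super.new(name);
      endfunction

      task body();
        // Constrained-random stimulus: 20 randomized transactions
        repeat (20) begin
          req = my_txn::type_id::create("req");
          start_item(req);
          if (!req.randomize()) `uvm_error("SEQ", "randomize failed")
          finish_item(req);
        end
      endtask
    endclass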
  • 32. SystemVerilog Language: Syntax, RTL, OOP, Class, Interface • Verification Concepts: Constrained Random, Coverage Driven, Transaction Level, Sequences, Scoreboards • Methodology: Base Classes, Use Cases, Configuration-db, Phases 32
  • 33.  SystemVerilog language syntax & semantics are a pre-requisite  All SystemVerilog experience is directly relevant for UVM (design/RTL, AVM, VMM, etc.)  But be aware the verification part of the language is much bigger than that used for design! ▪ Design: RTL, blocks, modules, vectors, assignments, arrays, etc. ▪ Verification: signals, interfaces, clocking blocks, scheduling, functions, tasks, OOP, classes, randomization, constraints, coverage, queues, etc.  All verification experience is directly transferable to UVM 33
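A sketch of two of those verification-only constructs (signal names are illustrative): an interface bundling DUT signals, with a clocking block that gives the testbench a race-free, cycle-relative view of them.

    interface bus_if (input logic clk);
      logic       valid;
      logic [7:0] data;
      logic       ready;

      // Clocking block: the testbench samples and drives these signals
      // relative to posedge clk, with the stated input/output skews
      clocking tb_cb @(posedge clk);
        default input #1step output #2ns;
        output valid, data;
        input  ready;
      endclocking

      // Modport bundling the clocking block for testbench use
      modport TB (clocking tb_cb);
    endinterface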
  • 35.  Modularity and Reusability – The methodology is designed as modular components (driver, sequencer, agent, env, etc.) to enable re-use at different levels of verification and across projects  Separating Tests from Testbenches – Tests, in terms of stimulus/sequences, are kept separate from the actual testbench hierarchy, and hence stimulus can be reused across different units or across projects  Simulator independent – The base class library and the methodology are supported by all simulators, and hence there is no dependence on any specific simulator  Sequence-based stimulus generation: There are several ways in which sequences can be developed, including randomization, layered sequences, virtual sequences, etc., which provide good control and rich stimulus generation capability.  Configuration mechanisms simplify configuration of objects with deep hierarchy. The configuration mechanism (using the UVM config database) helps in easily configuring different testbench components based upon the verification environment using them, without worrying about how deep any component is in the testbench hierarchy  Factory mechanisms simplify modification of components. Creating each component using the factory enables them to be overridden in different tests or environments without changing the underlying code base. 35
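Sketches of the last two mechanisms, continuing the hypothetical my_driver fragment from above and assuming an env/agent hierarchy exists: a factory override that swaps in an error-injecting driver for one test, and a config_db setting that a deeply nested component can retrieve.

    // An error-injecting variant of the earlier (hypothetical) driver
    class err_driver extends my_driver;
      `uvm_component_utils(err_driver)
      function new(string name, uvm_component parent);
        super.new(name, parent);
      endfunction
      // ...run_phase would corrupt data to exercise error handling
    endclass

    class my_error_test extends uvm_test;
      `uvm_component_utils(my_error_test)
      function new(string name, uvm_component parent);
        super.new(name, parent);
      endfunction

      function void build_phase(uvm_phase phase);
        super.build_phase(phase);
        // Factory override: wherever the env creates my_driver through the
        // factory, an err_driver is built instead; no env code changes
        my_driver::type_id::set_type_override(err_driver::get_type());
        // Configuration: publish a value that any component under
        // env.agent can fetch with uvm_config_db#(int)::get(...)
        uvm_config_db #(int)::set(this, "env.agent*", "num_txns", 50);
      endfunction
    endclass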
  • 36.  Steep learning curve: For anyone new to the methodology, the learning curve to understand all the details and the library is very steep.  Still developing and not perfect/stable: The methodology is still developing and carries a lot of overhead that can sometimes make simulation appear slow, and it may still contain some bugs 36
  • 37. RVM – Reference Verification Methodology; eRM – e Reuse Methodology; URM – Universal Reuse Methodology; AVM – Advanced Verification Methodology; VMM – Verification Methodology Manual; OVM – Open Verification Methodology 37
  • 38.  Run the most important tests first when you get a new build  Do not start over on your test pass every time you receive a new build  Regression tests that have already been run many times are unlikely to reveal new bugs. If your testcases are fully automated, by all means run all of them for each build.  Prioritize tests into a “must-pass” set, a more focused list of tests which can reduce the time of the regression. Major builds will warrant running all testcases.  Automate whenever it makes sense to do so.
  • 39.  Automation is a means of reducing manual effort in running repetitive tasks such as regressions.  Automation can also be applied to creating test-benches so that a standard infrastructure is maintained across the team.  This can be done using Perl scripts.  Why use Perl? - Free and works with most UNIX and Linux versions - Easy to work with, smaller learning curve - Advanced Perl with OOP support makes scripting easier
  • 40.  Scripting ▪ Perl, Python, C++  Languages and Methodologies ▪ Verilog, VHDL, SystemVerilog, UVM  Problem-solving and debugging skills  Diligent and methodical  Documentation skills  Reading skills!  Be up to date on standards and adjacent technologies  Don’t be a generalist... be a specialist!  Assess yourself: http://www.slideshare.net/RamdasMozhikunnath/exercises-on-advances-in-verification-methodologies
  • 41. 50 Shades of Life 50 Colours of Love
  • 42.  Belakoo Education Trust offers free quality education for underprivileged children. We run STEAM programs for Government School kids, supplementing their learning at school.  We participate in Skill Development Programs for students under various central/state-level schemes https://www.facebook.com/belakootrust/
  • 43. All pictures are from flickr.com, used either with copyright permission or under Creative Commons licences
  • 44. Visit my slideshare to view all these presentations
  • 45. Dr. Shivananda (Shivoo) R Koteshwar Group Director, Synopsys LINKEDIN: https://in.linkedin.com/in/shivoo2life EMAIL: shivoo.koteshwar@gmail.com FACEBOOK: shivoo.koteshwar SLIDESHARE: www.slideshare.net/shivoo.koteshwar