Formal Coverage Analysis: concepts and practicalities
Sergio Marchese
HUAWEI Technologies
Bristol, UK
www.huawei.com
ABSTRACT
Functional verification of digital designs requires planning, implementing and collecting coverage
metrics. Constrained-random stimuli generators can deliver high coverage figures, but the step to
achieve sign-off invariably involves significant engineering effort to review coverage holes, assess if they
can be waived or create additional tests to hit them. Coverage closure is a critical task as corner case
bugs often hide around hard-to-hit coverage targets. Effort might be wasted in manually waiving targets
that are expected to be unreachable, for example because of parameter values or coding style, or trying
to write tests to hit targets that are thought to be reachable but are in fact unreachable. Formal
technology can automatically analyse coverage holes and detect unreachable targets, making the
coverage closure process more robust and efficient. This paper presents underlying concepts, best
practices and issues that need to be considered when adopting a formal coverage analysis flow.
Formal Coverage Analysis: Concepts and Practicalities
SNUG 2016
Table of Contents
1. Introduction.......................................................................................................................................................................... 3
2. Concepts................................................................................................................................................................................. 4
2.1 Simulation-based Verification.........................................................................................................................4
2.2 Formal Verification............................................................................................................................................... 5
2.3 Interoperability..................................................................................................................................................... 5
2.4 Coverage Closure................................................................................................................................................... 6
3. Practicalities......................................................................................................................................................................... 7
3.1 The DUT.................................................................................................................................................................... 7
3.2 VC Static TCL Script.............................................................................................................................................. 7
3.3 Issues and Pitfalls................................................................................................................................................. 8
3.4 Formal Functional Coverage Analysis........................................................................................................10
3.5 Results.................................................................................................................................................................... 11
4. Summary and Conclusions........................................................................................................................................... 12
5. References........................................................................................................................................................................... 12
Table of Figures
Figure 1: Synopsys FCA Flow.............................................................................................................................................. 7
1. Introduction
Functional verification of digital hardware designs requires careful planning and progress tracking
that in turn largely relies on identifying and computing key metrics. Coverage metrics in particular
are critical to assess the quality of the verification environment and its bug-detection capability. One
definition of the word coverage is: the extent to which something deals with something else [1].
Coverage measures the extent to which a verification environment deals with its correspondent
device under test (DUT).
In the context of this paper, the DUT is a digital circuit modeled using a hardware description
language (HDL) at register-transfer level (RTL).
Simulation-based verification is by far the main technology used by design teams to detect
functional bugs and achieve coverage goals. A typical state-of-the-art verification environment will
have a constrained-random stimuli generator capable of stressing the DUT and driving it into both
targeted and unforeseen legal scenarios. Nevertheless, complex designs typically have many
functionalities that are hard to reach, leading to holes in coverage targets. Biasing the random
stimuli generator or writing directed tests is necessary to hit these targets. In some cases though, it
might prove impossible to find an input stimulus reaching an uncovered target. These unreachable
targets have to be reviewed and can be waived if judged to be expected.
The main challenge with this task, often referred to as coverage closure, is that it invariably requires
significant engineering effort and a deep level of expertise about the DUT. Moreover, results tend not
to be resilient to code revisions [2].
In the past 10 years the Electronic Design Automation (EDA) industry has made substantial
progress in improving capacity and usability of formal verification (FV) tools. The development of
application layers (or Apps) targeting very specific problems with largely automated flows has
eased the adoption of FV tools. Formal coverage analysis (FCA) is one such App: it is based on
the ability of formal tools to automatically analyse coverage targets and give a definite answer on
whether or not they can be reached by the DUT.
FCA can improve the coverage closure process in two ways. Firstly, it can automatically analyse
coverage targets and determine if they are reachable, saving engineering effort and greatly reducing
the risk of human errors. Secondly, it can run on new design revisions, eliminating the risk of relying
on results that could have been invalidated by RTL changes. It is worth noticing though that FCA
does not remove the need to review unreachable coverage targets. The challenge remains to judge
whether or not these targets are expected to be unreachable, for example due to coding style or
design parameters. Unexpected unreachable targets can point to RTL bugs, or to errors in the
specification or implementation of the targets themselves.
The first part of the paper reviews various characteristics of coverage targets and makes a case for
the use of FCA. The second part presents an actual implementation of an FCA flow using Synopsys
tools, and includes practical advice and pitfalls to be aware of. Finally, project results are reported
and conclusions drawn.
This paper assumes that the reader is familiar with state-of-the-art functional verification methods
and tools for digital hardware designs.
2. Concepts
There is no single measure that can provide absolute confidence on the quality of a verification
environment [3]. This has led to the development of a large variety of coverage metrics. This section
reviews a few characteristics of coverage targets, relevant to industrial applications, and makes a
case for the use of FCA. For a formalised and comprehensive survey of coverage metrics please refer
to [3].
2.1 Simulation-based Verification
Coverage metrics for simulation-based verification are mostly well established and understood. It is
nonetheless valuable, to set the context for this paper, to draw attention to some of their key
characteristics.
Control Coverage Exhaustive examination of all legal input stimuli is not feasible. Unfortunately
this poses the risk of not exercising parts of the DUT. Bugs in DUT functionality that is not exercised
would certainly go undetected. Control (or stimuli) coverage metrics are widely used in the industry
to measure the quality of the stimuli created by the verification environment (which typically
include directed and constrained-random stimuli generators). Popular control coverage metrics
are: line, branch, condition, finite state machine (FSM), toggle, sometimes collectively referred to as
structural coverage metrics, and functional coverage targets (for example SystemVerilog coverpoints
or cover properties).
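As a sketch of the functional coverage constructs mentioned above (all signal names are hypothetical, not taken from any specific DUT), a SystemVerilog coverpoint and a cover property might look like:

```systemverilog
// Hypothetical functional coverage targets; clk, req_type, req and
// stall are illustrative signal names only
covergroup cg_requests @(posedge clk);
  // exercise all request types issued to the DUT
  cp_req_type : coverpoint req_type {
    bins read  = {2'b00};
    bins write = {2'b01};
    bins flush = {2'b10};
  }
endgroup

// Temporal target: a request followed by a stall within 3 cycles
cov_req_stall : cover property (@(posedge clk) req ##[1:3] stall);
```

The cover property form has the advantage, discussed later, of being directly readable by formal tools.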
Observation Coverage A verification environment can have a powerful stimuli generator, but poor
bug-detection capability. To detect a bug the erroneous functionality has to be stimulated (or
activated) in the DUT, and its effect must propagate to an observable point. Observable points are
related to checkers able to detect a deviation of the DUT from its expected behavior. Observation
coverage targets can be implemented by introducing small changes in the design code often
referred to as mutations. In this context mutations are used to model functional bugs. A test hitting
an observation coverage target will have to stimulate the introduced bug and detect it through a
failing checker.
Automatic Coverage Several EDA tools are able to automatically generate coverage targets, purely
by a syntactic analysis of the DUT code. The Synopsys VCS simulator can generate line, branch,
condition, and toggle coverage simply by enabling compilation options. Tools can also attempt to
extract targets by analysing the semantics of the code. For example, certain coding styles can be
recognized to represent a FSM: coverage targets can then be created for FSM states and transitions.
A few tools can also automatically insert observation coverage targets, e.g. Synopsys Certitude. Very
low engineering effort and expertise are required to create and maintain these targets. High figures
for automatic coverage (e.g. 100% for code and branch coverage) are often a prerequisite for
moving to more sophisticated and effort-intensive metrics.
Manual Coverage To gain confidence on the quality of the verification environment it is often
necessary to specify and implement coverage targets. This is a time-consuming task, often involving
architects, designers, and verification engineers. The engineering effort to capture, implement and
maintain these targets across design revisions is typically very high. The process requires general
verification and specific DUT expertise. The risk of missing important targets is high as is the risk of
making implementation mistakes. High effort might be required to update these targets following
changes in the DUT specification or implementation.
Standard Coverage Coverage targets implemented in a standard language, e.g. SystemVerilog, can
be read by tools from various EDA vendors. On the other hand coverage models applied by tools to
automatically extract targets are not standardised. Even a metric as basic as line coverage cannot
be expected to be implemented in exactly the same way by two different EDA tools. That
means that different tools could extract different line coverage targets from the same DUT.
Moreover, the extracted targets are not translated into standard languages but saved in proprietary
databases.
Abstraction Level of Coverage Coverage targets can be defined and implemented at various
abstraction levels, including architectural, micro-architectural, interface and implementation. For
example, in the case of a CPU, one could be interested in ensuring that: specific sequences of
instructions have been executed (architectural level); certain protocol rules have been exercised at
the memory interface (interface level); specific instruction couples have been issued at the same
clock cycle (micro-architectural level); a fifo has gone through the sequence of conditions full,
almost_full, full in 3 consecutive clock cycles (implementation level).
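The implementation-level example can be captured as a SystemVerilog cover property; a minimal sketch, where clk, fifo_full and fifo_almost_full are assumed signal names:

```systemverilog
// FIFO implementation-level target: full, then almost_full, then full
// again, in 3 consecutive clock cycles
cov_full_dip : cover property (@(posedge clk)
  fifo_full ##1 fifo_almost_full ##1 fifo_full);
```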
Unreachable Coverage A coverage target may be impossible to hit. For a trivial example consider
the coverage target tie0 == 1'b1 where tie0 is a signal that is stuck at zero in the DUT. Even in an
exhaustive simulation where all legal input stimuli are examined, this target would not be hit.
Considering that exhaustive simulation is not feasible in practice, this method cannot be used to
prove that a coverage target is unreachable. Human reasoning can be used as an alternative method,
but it cannot be automated and is prone to errors. It is also worth mentioning that a typical DUT
might have many expected unreachable coverage targets.
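The tie0 example can be written down as a minimal sketch (clk is an assumed clock name); a formal tool would prove this target unreachable, while no finite simulation run could establish that:

```systemverilog
wire tie0 = 1'b0;  // stuck at zero by construction

// This target can never be hit, in simulation or formally
cov_tie0_high : cover property (@(posedge clk) tie0 == 1'b1);
```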
2.2 Formal Verification
FV tools can also automatically extract coverage targets and read targets written in standard
languages. The characteristics reviewed for simulation coverage do apply to formal targets as well.
The coverage models used and the definition of what it means to hit a target might be different.
One of the key advantages of FV technology over simulation is that it exhaustively examines all input
stimuli. Control coverage targets can be processed to determine whether they are reachable or
unreachable. If the tool is able to prove that no input sequence to the DUT exists such that the target
is exercised, the coverage item can be marked as unreachable. If the tool proves that the coverage
target is reachable, it can also provide a trace showing an input sequence that drives the DUT into
the desired scenario. This is sometimes referred to as witness trace.
Input Constraints In both simulation and formal it is necessary to ensure that the DUT is not
stimulated with illegal input sequences. Illegal inputs can cause false failures, i.e. failures not due to
bugs in the DUT. Coverage targets could be unreachable because of input constraints. On the other
hand a target that is unreachable when the DUT input is left totally unconstrained will also be
unreachable if the space of legal input stimuli is restricted. Knowing whether a target is unreachable
because of input constraints is valuable: if this is unexpected, it may point to errors in the
constraints themselves.
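An input constraint can be expressed as a SystemVerilog assume property; a minimal sketch, where sel is a hypothetical one-hot select input:

```systemverilog
// Restrict analysis to legal stimuli: at most one select line
// active per cycle
asm_sel_onehot : assume property (@(posedge clk) $onehot0(sel));
```

Formal tools treat such assumptions as constraints on the input space, while simulators can check them against the generated stimuli.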
2.3 Interoperability
Ideally all coverage targets should be expressed in standard semantics, and coverage metrics saved
in a standard database. This would enable interoperability between EDA vendors and verification
technologies. Coverage targets extracted by a simulator could be hit by tests running on another
simulator, or proven unreachable by a formal tool. Coverage databases could be read by yet another
tool for analysis and display. Each of these tools could belong to a different EDA vendor.
The reality is quite different. EDA vendors use proprietary databases and technology to extract
coverage targets and store metrics.
The Accellera Systems Initiative has developed the Unified Coverage Interoperability
Standard (UCIS) [4]. Currently UCIS defines a standard application programming interface (API) that should
enable EDA vendors and users to read and write coverage databases compliant to this standard.
This is a step in the right direction, but unfortunately UCIS adoption in the industry is still at an
early stage. Moreover, the definition and adoption of standard coverage models, even for the most
popular metrics, is even further away.
For manual coverage targets it is possible to use standard languages. In SystemVerilog coverage
targets can be implemented using the constructs coverpoint and cover property. The latter has the
advantage of being supported by FV tools.
2.4 Coverage Closure
Design teams typically achieve a decent level of coverage relatively easily. However, as they
approach their ultimate goal, increasing the coverage level often becomes very challenging and time
consuming. Targets that have not been hit need to be reviewed and the test suite improved.
Functional areas that are hard to hit are less tested and therefore present a higher risk of errors.
Formal Verification can complement simulation and speed up the coverage closure process in
various ways. Certain functional areas that are hard to test with simulation could in fact be sweet spots for
FV tools and methodologies. More relevant to the scope of this paper, coverage holes that can be
read by formal tools might be proven to be unreachable, saving valuable engineering effort
compared to using human reasoning. On the other hand, for coverage targets proven to be reachable
formal tools can provide witness traces. These traces could help in working out a simulation test
able to hit the target.
The next section of the paper presents an actual implementation of an FCA flow using Synopsys tools.
3. Practicalities
The implementation of an FCA flow requires an FV tool and a means to read coverage targets. In the
case of structural coverage targets, coverage reports from the simulator can be parsed to extract
holes and generate corresponding coverage targets in a language accepted by the FV tool. This
approach has been used successfully in the past [2], but users are required to build and maintain
scripts translating information between tools.
Nowadays major EDA vendors provide ready-made solutions that require minimal implementation
effort. The Synopsys FCA flow involves three tools: VCS as simulator; VC Static as FV tool; Verdi to
display and analyse coverage results. A graphical representation of the flow is shown in Figure 1.
Figure 1: Synopsys FCA Flow
No translation is required to exchange information between tools. The coverage database generated
by VCS can be read by VC Static; the exclusion file generated by VC Static, containing information on
the coverage targets proven unreachable, can be read by Verdi.
This section presents the Synopsys FCA solution as applied to an industrial project.
3.1 The DUT
The DUT to which the FCA flow has been applied is a complex subsystem. It contains several sub-
modules and small memory structures, and is implemented in tens of thousands of lines of
synthesisable SystemVerilog code.
3.2 VC Static TCL Script
The TCL script to drive the VC Static tool is reported below. The script is simple and reusable across
projects. Comments are included to further clarify the meaning of the main commands.
set top my_dut
set inst_path {tbench_top.subsystem.dut[0].gen}
# run overnight
set_fml_var fml_max_time 12H
set_app_var fml_mode_on true
# use multiple licenses to reduce run time
# at least 5 as 4 jobs will run in parallel even using 1 license
set_grid_usage -type LSF=<int> -control <job_submission_command>
# blackbox memories to reduce complexity
set_blackbox *RAM*
# OPTIONAL: read simulation coverage results generated by VCS
read_covdb -cov_input merged_cov_database -cov_dut $inst_path.$top
# compile DUT using same options (e.g. defines) used for VCS
# no toggle coverage targets as there are too many
read_file -single_step -format sverilog \
  -cov line+cond+fsm_state+fsm_trans -top $top \
  -vcs "-f vcs_compile_options.f"
# automatically setup clocks (at least attempt)
infer_clock_roots
report_clock_roots -file clk_setup.sdc
read_sdc clk_setup.sdc
# compute reset state: starting state for formal analysis
# but do not force reset to be inactive during formal analysis
sim_force {rsta_n} -apply 0
sim_force {rstb_n} -apply 0
sim_run 5
sim_save_reset
# review what is being black boxed
report_black_box -design
# sanity check on tool setup
check_fv_setup -block
report_fv_setup -list -limit 10
# run formal proofs on coverage holes: may take several hours
check_cov -block
# save results in exclusion file that can be read in Verdi
save_cov_exclusion -file proven_unreachable.el
report_cov
exit
3.3 Issues and Pitfalls
The VC Static TCL script to implement the FCA flow is simple. Nonetheless, there are a number of
potential issues and pitfalls that have emerged during project execution. They are shared below.
DUT Compilation It is critical that VCS and VC Static compile the same configuration of the DUT. As
shown in the script, the VC Static command read_file accepts the option -vcs. The same compilation control
file can theoretically be used for both tools. Unfortunately though the VCS compilation is likely to
include many options (links to libraries, flags, etc.), and some of those may create errors in VC Static.
While Synopsys improves the integration between its tools, it is recommended to separate out the
options relevant to VC Static from the others when compiling for VCS.
Handling Complexity It is very likely that memories in the DUT will need to be black boxed. Black
boxing is safe as it cannot lead to reachable targets being proven unreachable (the reverse can
happen, i.e. unreachable targets could appear reachable because of black boxed
components). It is of course possible to increase the maximum runtime of the tool and the number
of licenses to the maximum available. Typically this flow will be run over the weekend, or perhaps
overnight on some occasions. Using coverage results from a good test suite will also decrease the
number of uncovered targets that need to be analysed by the FV tool.
DUT Reset As seen in the VC Static TCL script (and for FV tools in general) users have to declare
DUT clocks and resets. Some FV tools might implicitly limit the behavior of reset inputs which could
lead to results showing reset initialization code proven unreachable. This behavior can be avoided
in VC Static by using the command sim_force instead of create_reset. For more details please refer to
[5].
Module-Based Exclusions The command to read the VCS coverage database is optional. If omitted,
VC Static will analyse all coverage targets. Clearly this might impact run time. Moreover, in this case
the generated exclusion file will be module-based, rather than instance-based. A module-based
exclusion file has to be used with care as it is not valid for a different configuration of the DUT (for
example a DUT instantiated with parameter values different from the ones for which the exclusion
file was computed).
Input Constraints In this project no constraints were used to restrict the DUT input stimuli. This is
a safe choice: there is no risk of proving coverage targets to be unreachable because of mistakes in
the assumptions. It is also a pragmatic choice: implementing assumptions is difficult and time
consuming. The drawback is that coverage targets proven to be reachable could in fact be
unreachable when considering only legal input stimuli.
Top Level The top level chosen in the VC Static script is the highest possible in the context of the
project. It would be possible to do multiple runs on sub-modules to reduce complexity. Results
would still be valid but it is likely that fewer coverage targets would be proven unreachable. This is
because the logic surrounding a sub-module has the same effect as input constraints. Running the
FCA flow on the highest possible top level is therefore the preferred choice.
Inconsistent Coverage Options Unfortunately the coverage options accepted by VCS and VC Static
are not yet fully consistent. VCS accepts the following options: line, branch, cond, fsm, tgl, and assert.
VC Static on the other hand accepts options: line, cond, tgl, fsm_state and fsm_trans.
Inconsistent Results It is possible that coverage targets proven unreachable by VC Static could in
fact be hit in a simulation test. Inconsistent results might be revealed using the following two steps.
Firstly run VC Static to generate a module-based exclusion file, thus without reading a coverage
database. Secondly read both the simulation coverage database and the exclusion file in Verdi, but
applying the option -excl_strict ($ verdi -cov -excl_strict -covdir DB_DIR -elfile proven_unreachable.el).
Verdi will report any attempt to exclude coverage targets that are marked as hit in the database.
These targets can also be saved into a file. A complete explanation of this phenomenon goes beyond
the scope of this paper. One of the reasons why this can happen is that simulators interpret the DUT
HDL code according to simulation semantics. FV tools on the other hand tend to be closer to
synthesis semantics, at least in their typical default settings. Targets proven to be unreachable could
in fact be reached in simulation due to Xs or glitches. In other words, the coverage results reported
by simulators could be over-optimistic. The VCS compilation option -cm_glitch can help reduce this
problem (for more information, please refer to the relevant Synopsys documentation). However, for
unclear reasons, when using this option VCS is not able to collect fsm coverage. It would also be
safer to start collecting coverage information a number of cycles after reset has been released. Tests
could cover the DUT initialization functionality by re-triggering reset during normal DUT operation.
VCS does not have a coverage option to easily allow for this.
Review Results It is important to reiterate that FCA cannot judge whether a coverage target is
expected to be unreachable or not. Coding style and DUT parameters can result in many expected
unreachable coverage targets (for a given DUT configuration). Review of FCA results is valuable
because unexpected unreachable targets could point to DUT bugs. The engineering effort necessary
to review FCA results can be reduced using several strategies. For example, strict coding guidelines
could be applied to mark expected unreachable code lines, e.g. default branches of case statements.
VC Static supports automatic detection of intentionally unreachable lines according to specific
coding guidelines. For more information please refer to the VC Static documentation. Language
features (e.g. generate statements) can also be used to reduce expected unreachable targets.
Whenever possible this should be the preferred approach as it is tool independent.
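As an illustration of such a coding guideline (state encodings and names are assumed, not drawn from the project's code), a fully decoded case statement leaves a default branch that is unreachable by construction and could be tagged for automatic exclusion:

```systemverilog
// All four encodings of the 2-bit state signal are decoded explicitly,
// so the default branch is expected to be unreachable
always_comb begin
  unique case (state)
    2'b00:   nxt_state = ST_IDLE;
    2'b01:   nxt_state = ST_RUN;
    2'b10:   nxt_state = ST_WAIT;
    2'b11:   nxt_state = ST_DONE;
    default: nxt_state = ST_IDLE;  // UNREACHABLE by construction
  endcase
end
```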
3.4 Formal Functional Coverage Analysis
The previous sections have focused on the analysis of structural coverage targets. A preliminary
flow to apply the same process to functional coverage targets has also been developed as part of this
project. Hundreds of SystemVerilog properties have been written by design and verification
engineers. Cover properties directly express coverage targets. Assume and assert properties might
also be processed by tools to automatically extract coverage targets. As an example, for a property
expressed as A |-> B, it is sensible to ensure that the coverage target A = True can be reached
by the DUT. Should the condition be unreachable, the property will never trigger in any simulation.
As already pointed out, this could be expected, for example due to the specific DUT configuration
under analysis, or could point to a DUT bug or a mistake in the property.
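A minimal sketch of this idea (req and gnt are hypothetical signals): the cover property checks that the precondition of the assertion is reachable at all.

```systemverilog
// Assertion of the form A |-> B
ast_req_gnt : assert property (@(posedge clk) req |-> ##[1:2] gnt);

// Companion coverage target: if this is unreachable, the assertion
// above can never trigger in any simulation
cov_req     : cover property (@(posedge clk) req);
```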
The TCL script to drive the VC Static tool is reported below. Once again, the script is simple and
reusable across projects. Comments are included to further clarify the meaning of the main
commands. The main differences from the previous script are: automatic generation of the exclusion
file is not supported; reading of simulation coverage results is not supported.
set top my_dut
# run overnight
set_fml_var fml_max_time 12H
set_app_var fml_mode_on true
# use multiple licenses to reduce run time
# at least 5 as 4 jobs will run in parallel even using 1 license
set_grid_usage -type LSF=<int> -control <job_submission_command>
# blackbox memories to reduce complexity
set_blackbox *RAM*
# compile DUT using same options (e.g. defines) used for VCS
# note use of option -sva to include properties
read_file -sva -single_step -format sverilog -top $top \
  -vcs "-f vcs_compile_options.f"
# automatically setup clocks (at least attempt)
infer_clock_roots
report_clock_roots -file clk_setup.sdc
read_sdc clk_setup.sdc
# compute reset state: starting state for formal analysis
# but do not force reset to be inactive during formal analysis
sim_force {rsta_n} -apply 0
sim_force {rstb_n} -apply 0
sim_run 5
sim_save_reset
# review what is being black boxed
report_black_box -design
set cover_props [get_props -type cover]
set assert_props [get_props -type assert]
set assume_props [get_props -type assume]
# be safe and turn all block level assumptions into assertions
# leave DUT inputs unconstrained
fvassert $assume_props
fvenable *
# run formal proofs
check_fv -block
# report will mark which properties are unreachable
report_fv $cover_props -list
report_fv $assert_props -list
report_fv $assume_props -list
exit
3.5 Results
The results reported refer to a specific FCA flow run done before final project delivery. The tool was
run overnight (12 hours maximum run time) using several licenses. Load Sharing Facility (LSF) was
used to distribute jobs over a Linux compute cluster. The VC Static version used was 2015.09sp1_1.
The effort required to set up the initial flow, for a user with extensive formal verification experience,
but no previous exposure to VC Static, was 2 days. A further 5 days of effort, over the course of
about 2 months, were required to unveil and address issues and improve the flow.
A simulation coverage database, derived from a suite of 2000 mixed random and directed tests, was
read before starting the formal analysis. VC Static analysed a total of 6045 line coverage targets and
11304 condition coverage targets. It could prove that 420 lines and 3195 conditions were
unreachable. No inconclusive (i.e. tool giving up) proofs were reported. In other words, all other
targets were proven to be reachable.
It was not possible to measure the engineering effort saved by applying this flow. The impact on
achieved DUT quality could not be assessed either. Nonetheless, considering the low effort
necessary to implement the FCA flow for the first time, and its high reusability across projects, the
Return On Investment (ROI) is certainly positive. This consideration however is purely
technical and does not take into account licensing costs.
4. Summary and Conclusions
This paper reviews important aspects of coverage targets and shows how FV can contribute to the
coverage closure process. Implementing an FCA flow is relatively easy and requires low effort.
Nonetheless, there are issues and pitfalls to be aware of, as shown in section 3.3. Results can give a
valuable contribution to achieving the project’s coverage goals while saving engineering effort.
The process and tools presented are relatively mature, although improvements are still necessary.
On a more general note, the development and adoption of coverage standards by the industry would
enable smoother, more powerful and reliable FCA flows.
5. References
[1] http://www.oxforddictionaries.com/definition/english/coverage
[2] T. Blackmore, D. Halliwell, P. Barker, K. Eder and N. Ramaram, "Analysing and closing simulation
coverage by automatic generation and verification of formal properties from coverage reports",
Integrated Formal Methods, Lecture Notes in Computer Science, vol. 7321, pp. 84-98, Springer, 2012
[3] H. Chockler, O. Kupferman and M. Y. Vardi, "Coverage Metrics for Formal Verification", Correct
Hardware Design and Verification Methods (CHARME), pp. 111-125, 2003
[4] http://www.accellera.org/activities/working-groups/ucis
[5] "Difference between create_reset and sim_force commands for reset block evaluation in VC FCA",
Doc Id: 1870149, Synopsys SolvNet
2.4 Coverage Closure................................................................................................................................... 6
3. Practicalities......................................................................................................................................................... 7
3.1 The DUT.................................................................................................................................................... 7
3.2 VC Static TCL Script.............................................................................................................................. 7
3.3 Issues and Pitfalls................................................................................................................................. 8
3.4 Formal Functional Coverage Analysis........................................................................................ 10
3.5 Results.................................................................................................................................................... 11
4. Summary and Conclusions........................................................................................................................... 12
5. References........................................................................................................................................................... 12

Table of Figures
Figure 1: Synopsys FCA Flow.............................................................................................................................. 7
1. Introduction

Functional verification of digital hardware designs requires careful planning and progress tracking, which in turn largely relies on identifying and computing key metrics. Coverage metrics in particular are critical to assess the quality of the verification environment and its bug-detection capability. One definition of the word coverage is: the extent to which something deals with something else [1]. Coverage measures the extent to which a verification environment deals with its corresponding device under test (DUT). In the context of this paper, the DUT is a digital circuit modeled using a hardware description language (HDL) at register-transfer level (RTL).

Simulation-based verification is by far the main technology used by design teams to detect functional bugs and achieve coverage goals. A typical state-of-the-art verification environment has a constrained-random stimuli generator capable of stressing the DUT and driving it into both targeted and unforeseen legal scenarios. Nevertheless, complex designs typically have many functionalities that are hard to reach, leading to holes in coverage targets. Biasing the random stimuli generator or writing directed tests is necessary to hit these targets. In some cases, though, it might prove impossible to find an input stimulus reaching an uncovered target. These unreachable targets have to be reviewed and can be waived if judged to be expected. The main challenge with this task, often referred to as coverage closure, is that it invariably requires significant engineering effort and a deep level of expertise about the DUT. Moreover, results tend not to be resilient to code revisions [2].

In the past 10 years the Electronic Design Automation (EDA) industry has made substantial progress in improving the capacity and usability of formal verification (FV) tools. The development of application layers (or Apps) targeting very specific problems with largely automated flows has eased the adoption of FV tools. Formal coverage analysis (FCA) is one such App: it is based on the ability of formal tools to automatically analyse coverage targets and give a definite answer on whether or not they can be reached by the DUT.

FCA can improve the coverage closure process in two ways. Firstly, it can automatically analyse coverage targets and determine if they are reachable, saving engineering effort and greatly reducing the risk of human error. Secondly, it can run on new design revisions, eliminating the risk of relying on results that could have been invalidated by RTL changes. It is worth noting, though, that FCA does not remove the need to review unreachable coverage targets. The challenge remains to judge whether or not these targets are expected to be unreachable, for example due to coding style or design parameters. Unexpected unreachable targets can point to RTL bugs, or to errors in the specification or implementation of the targets themselves.

The first part of the paper reviews various characteristics of coverage targets and makes a case for the use of FCA. The second part presents an actual implementation of an FCA flow using Synopsys tools, and includes practical advice and pitfalls to be aware of. Finally, project results are reported and conclusions drawn. This paper assumes that the reader is familiar with state-of-the-art functional verification methods and tools for digital hardware designs.
2. Concepts

There is no single measure that can provide absolute confidence in the quality of a verification environment [3]. This has led to the development of a large variety of coverage metrics. This section reviews a few characteristics of coverage targets relevant to industrial applications, and makes a case for the use of FCA. For a formalised and comprehensive survey of coverage metrics please refer to [3].

2.1 Simulation-based Verification

Coverage metrics for simulation-based verification are mostly well established and understood. It is nonetheless valuable, to set the context for this paper, to draw attention to some of their key characteristics.

Control Coverage

Exhaustive examination of all legal input stimuli is not feasible. Unfortunately this poses the risk of not exercising parts of the DUT. Bugs in DUT functionality that is not exercised would certainly go undetected. Control (or stimuli) coverage metrics are widely used in the industry to measure the quality of the stimuli created by the verification environment (which typically includes directed and constrained-random stimuli generators). Popular control coverage metrics are: line, branch, condition, finite state machine (FSM) and toggle coverage, sometimes collectively referred to as structural coverage metrics, and functional coverage targets (for example SystemVerilog coverpoints or cover properties).

Observation Coverage

A verification environment can have a powerful stimuli generator, but poor bug-detection capability. To detect a bug the erroneous functionality has to be stimulated (or activated) in the DUT, and its effect must propagate to an observable point. Observable points are related to checkers able to detect a deviation of the DUT from its expected behavior. Observation coverage targets can be implemented by introducing small changes in the design code, often referred to as mutations. In this context mutations are used to model functional bugs. A test hitting an observation coverage target has to stimulate the introduced bug and detect it through a failing checker.

Automatic Coverage

Several EDA tools are able to automatically generate coverage targets, purely by a syntactic analysis of the DUT code. The Synopsys VCS simulator can generate line, branch, condition and toggle coverage simply by enabling compilation options. Tools can also attempt to extract targets by analysing the semantics of the code. For example, certain coding styles can be recognised to represent an FSM: coverage targets can then be created for FSM states and transitions. A few tools can also automatically insert observation coverage targets, e.g. Synopsys Certitude. Very low engineering effort and expertise is required to create and maintain these targets. High figures for automatic coverage (e.g. 100% for line and branch coverage) are often a prerequisite for moving to more sophisticated and effort-intensive metrics.

Manual Coverage

To gain confidence in the quality of the verification environment it is often necessary to specify and implement coverage targets manually. This is a time-consuming task, often involving architects, designers and verification engineers. The engineering effort to capture, implement and maintain these targets across design revisions is typically very high. The process requires general verification and specific DUT expertise. The risk of missing important targets is high, as is the risk of making implementation mistakes. High effort might be required to update these targets following changes in the DUT specification or implementation.

Standard Coverage

Coverage targets implemented in a standard language, e.g. SystemVerilog, can be read by tools from various EDA vendors. On the other hand, the coverage models applied by tools to
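The activate-and-observe requirement behind observation coverage can be illustrated with a small software analogy (this is only a sketch of the principle, not how a tool such as Certitude instruments HDL; the alu_add functions and stimuli lists are invented for illustration):

```python
# Observation (mutation) coverage, as a software analogy.
# A mutation models a functional bug; a test "hits" the observation
# target only if it stimulates the mutated code AND a checker fails.

def alu_add(a, b):
    return a + b          # original design intent

def alu_add_mutant(a, b):
    return a - b          # mutation: '+' replaced by '-'

def run_test(dut, stimuli):
    """Return True if any checker detects a deviation from the reference model."""
    for a, b in stimuli:
        if dut(a, b) != a + b:   # checker: compare against reference model
            return True          # bug activated AND observed
    return False

weak_test = [(0, 0)]             # activates the mutant but cannot observe it: 0+0 == 0-0
strong_test = [(0, 0), (3, 1)]   # 3-1 != 3+1, so the checker fires

assert run_test(alu_add_mutant, weak_test) is False   # mutant survives: poor observation coverage
assert run_test(alu_add_mutant, strong_test) is True  # mutant detected
```

The weak test shows why stimulating buggy code is not enough: the erroneous value must also propagate to a checker, which is exactly what observation coverage targets measure.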
automatically extract targets are not standardised. Even a basic coverage metric such as line coverage cannot be expected to be implemented in exactly the same way by two different EDA tools. This means that different tools could extract different line coverage targets from the same DUT. Moreover, the extracted targets are not translated into standard languages, but saved in proprietary databases.

Abstraction Level of Coverage

Coverage targets can be defined and implemented at various abstraction levels, including architectural, micro-architectural, interface and implementation. For example, in the case of a CPU, one could be interested in ensuring that: specific sequences of instructions have been executed (architectural level); certain protocol rules have been exercised at the memory interface (interface level); specific instruction couples have been issued in the same clock cycle (micro-architectural level); a FIFO has gone through the sequence of conditions full - almost_full - full in 3 consecutive clock cycles (implementation level).

Unreachable Coverage

A coverage target may be impossible to hit. For a trivial example, consider the coverage target tie0 == 1'b1, where tie0 is a signal that is stuck at zero in the DUT. Even in an exhaustive simulation where all legal input stimuli are examined, this target would not be hit. Considering that exhaustive simulation is not feasible in practice, this method cannot be used to prove that a coverage target is unreachable. Human reasoning can be used as an alternative method, but it cannot be automated and is prone to errors. It is also worth mentioning that a typical DUT might have many expected unreachable coverage targets.

2.2 Formal Verification

FV tools can also automatically extract coverage targets and read targets written in standard languages. The characteristics reviewed for simulation coverage apply to formal targets as well. The coverage models used and the definition of what it means to hit a target might be different. One of the key advantages of FV technology over simulation is that it exhaustively examines all input stimuli. Control coverage targets can be processed to determine whether they are reachable or unreachable. If the tool is able to prove that no input sequence to the DUT exists such that the target is exercised, the coverage item can be marked as unreachable. If the tool proves that the coverage target is reachable, it can also provide a trace showing an input sequence that drives the DUT into the desired scenario. This is sometimes referred to as a witness trace.

Input Constraints

In both simulation and formal it is necessary to ensure that the DUT is not stimulated with illegal input sequences. Illegal inputs can cause false failures, i.e. failures not due to bugs in the DUT. Coverage targets could be unreachable because of input constraints. On the other hand, a target that is unreachable when the DUT input is left totally unconstrained will also be unreachable if the space of legal input stimuli is restricted. Knowing whether a target is unreachable because of input constraints is valuable, as it may point to errors in the constraints themselves if this is unexpected.

2.3 Interoperability

Ideally all coverage targets would be expressed in standard semantics, and coverage metrics saved in a standard database. This would enable interoperability between EDA vendors and verification technologies. Coverage targets extracted by a simulator could be hit by tests running on another simulator, or proven unreachable by a formal tool. Coverage databases could be read by yet another tool for analysis and display. Each of these tools could belong to a different EDA vendor. The reality is quite different: EDA vendors use proprietary databases and technology to extract coverage targets and store metrics.
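The reachable/unreachable classification and the witness traces described above can be illustrated, in miniature, by an explicit breadth-first search over a toy state space. Real FV tools use far more sophisticated engines (e.g. SAT-based model checking); the saturating-counter model below is invented purely for illustration:

```python
from collections import deque

# Toy design: a counter that saturates at 2, so state 3 is unreachable.
# next_states models the transition relation under all inputs ('incr' or 'hold').
def next_states(s):
    return {min(s + 1, 2), s}

def analyse(target, start=0):
    """Breadth-first reachability from the reset state.
    Returns a witness trace (state sequence) if the target is reachable,
    or None if the exhaustive search proves it unreachable."""
    queue, seen = deque([[start]]), {start}
    while queue:
        trace = queue.popleft()
        if trace[-1] == target:
            return trace                  # witness trace from reset to target
        for n in next_states(trace[-1]):
            if n not in seen:
                seen.add(n)
                queue.append(trace + [n])
    return None                           # all reachable states examined: proven unreachable

assert analyse(2) == [0, 1, 2]   # reachable, with a witness trace
assert analyse(3) is None        # proven unreachable (like the tie0 example)
```

The key property mirrored here is exhaustiveness: unlike simulation, the search visits every reachable state, so a None result is a proof, not merely a failure to hit the target.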
The Accellera Systems Initiative has developed the Unified Coverage Interoperability Standard (UCIS) [4]. Currently UCIS defines a standard application programming interface (API) that should enable EDA vendors and users to read and write coverage databases compliant with this standard. This is a step in the right direction, but unfortunately UCIS adoption in the industry is still at an early stage. Moreover, the definition and adoption of standard coverage models, even for the most popular metrics, is even further away. For manual coverage targets it is possible to use standard languages. In SystemVerilog, coverage targets can be implemented using the coverpoint and cover property constructs. The latter has the advantage of being supported by FV tools.

2.4 Coverage Closure

Design teams typically achieve a decent level of coverage relatively easily. However, as they approach their ultimate goal, increasing the coverage level often becomes very challenging and time consuming. Targets that have not been hit need to be reviewed and the test suite improved. Functional areas that are hard to hit are less tested and therefore present a higher risk of errors. Formal verification can complement simulation and speed up the coverage closure process in various ways. Certain functional areas that are hard to test with simulation could in fact be sweet spots for FV tools and methodologies. More relevant to the scope of this paper, coverage holes that can be read by formal tools might be proven to be unreachable, saving valuable engineering effort compared to using human reasoning. For coverage targets proven to be reachable, on the other hand, formal tools can provide witness traces. These traces can help in working out a simulation test able to hit the target. The next section of the paper presents an actual implementation of an FCA flow using Synopsys tools.
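As a minimal sketch of how simulator coverage holes can be turned into formal coverage targets (the approach also taken in [2]), the snippet below converts hole descriptions into SystemVerilog cover properties. The report format, signal names and property template are hypothetical, not the output of any specific simulator:

```python
# Sketch: translate simulator coverage holes into cover properties that
# an FV tool can then prove reachable or unreachable.
# The 'KIND <module> <expression>' report format is invented for illustration.

def holes_to_cover_properties(report_lines, clock="clk"):
    """Turn each condition-coverage hole into a SystemVerilog cover property."""
    props = []
    for i, line in enumerate(report_lines):
        kind, module, expr = line.split(maxsplit=2)
        if kind == "COND":  # only condition-coverage holes in this sketch
            props.append(
                f"{module}_hole_{i}: cover property (@(posedge {clock}) {expr});"
            )
    return props

report = [
    "COND fifo_ctrl (full && push)",
    "COND fifo_ctrl (empty && pop)",
]
for p in holes_to_cover_properties(report):
    print(p)
```

A ready-made vendor flow, such as the Synopsys one presented next, removes the need to build and maintain this kind of translation script.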
3. Practicalities

The implementation of an FCA flow requires an FV tool and a means to read coverage targets. In the case of structural coverage targets, coverage reports from the simulator can be parsed to extract holes and generate corresponding coverage targets in a language accepted by the FV tool. This approach has been used successfully in the past [2], but users are required to build and maintain scripts that translate information between tools. Nowadays, major EDA vendors provide ready-made solutions that require minimal implementation effort. The Synopsys FCA flow involves three tools: VCS as the simulator; VC Static as the FV tool; Verdi to display and analyse coverage results. A graphical representation of the flow is shown in Figure 1.

Figure 1: Synopsys FCA Flow

No translation is required to exchange information between the tools. The coverage database generated by VCS can be read by VC Static; the exclusion file generated by VC Static, containing information on the coverage targets proven unreachable, can be read by Verdi. This section presents the Synopsys FCA solution as applied to an industrial project.

3.1 The DUT

The DUT to which the FCA flow has been applied is a complex subsystem. It contains several sub-modules and small memory structures, and is implemented in tens of thousands of lines of synthesisable SystemVerilog code.

3.2 VC Static TCL Script

The TCL script used to drive the VC Static tool is reported below. The script is simple and reusable across projects. Comments are included to further clarify the meaning of the main commands.

set top my_dut
set inst_path {tbench_top.subsystem.dut[0].gen}
# run overnight
set_fml_var fml_max_time 12H
set_app_var fml_mode_on true

# use multiple licenses to reduce run time
# at least 5, as 4 jobs will run in parallel even when using 1 license
set_grid_usage -type LSF=<int> -control <job_submission_command>

# blackbox memories to reduce complexity
set_blackbox *RAM*

# OPTIONAL: read simulation coverage results generated by VCS
read_covdb -cov_input merged_cov_database -cov_dut $inst_path.$top

# compile DUT using the same options (e.g. defines) used for VCS
# no toggle coverage targets as there are too many
read_file -single_step -format sverilog -cov line+cond+fsm_state+fsm_trans -top $top -vcs "-f vcs_compile_options.f"

# automatically set up clocks (at least attempt to)
infer_clock_roots
report_clock_roots -file clk_setup.sdc
read_sdc clk_setup.sdc

# compute the reset state (the starting state for formal analysis),
# but do not force reset to be inactive during formal analysis
sim_force {rsta_n} -apply 0
sim_force {rstb_n} -apply 0
sim_run 5
sim_save_reset

# review what is being black boxed
report_black_box -design

# sanity check on tool setup
check_fv_setup -block
report_fv_setup -list -limit 10

# run formal proofs on coverage holes - may take several hours
check_cov -block

# save results in an exclusion file that can be read in Verdi
save_cov_exclusion -file proven_unreachable.el
report_cov

exit

3.3 Issues and Pitfalls

The VC Static TCL script implementing the FCA flow is simple. Nonetheless, there are a number of
potential issues and pitfalls that have emerged during project execution. They are shared below.

DUT Compilation

It is critical that VCS and VC Static compile the same configuration of the DUT. As shown in the script, the VC Static command read_file accepts the option -vcs, so the same compilation control file can in theory be used for both tools. Unfortunately, the VCS compilation is likely to include many options (links to libraries, flags, etc.), and some of those may cause errors in VC Static. While Synopsys improves the integration between its tools, it is recommended to separate out the options relevant to VC Static from the others when compiling for VCS.

Handling Complexity

It is very likely that memories in the DUT will need to be black boxed. Black boxing is safe, in the sense that it cannot lead to reachable targets being proven unreachable (the reverse can happen, i.e. unreachable targets could be reported as reachable because of black boxed components). It is of course possible to increase the maximum runtime of the tool and the number of licenses to the maximum available. Typically this flow will be run over the weekend, and perhaps overnight on some occasions. Using coverage results from a good test suite will also decrease the number of uncovered targets that need to be analysed by the FV tool.

DUT Reset

As seen in the VC Static TCL script, users have to declare DUT clocks and resets (this is true for FV tools in general). Some FV tools might implicitly limit the behavior of reset inputs, which could lead to reset initialization code being proven unreachable. This behavior can be avoided in VC Static by using the command sim_force instead of create_reset. For more details please refer to [5].

Module-Based Exclusions

The command to read the VCS coverage database is optional. If omitted, VC Static will analyse all coverage targets. Clearly, this might impact run time.
Moreover, in this case the generated exclusion file will be module-based rather than instance-based. A module-based exclusion file has to be used with care, as it is not valid for a different configuration of the DUT (for example, a DUT instantiated with parameter values different from the ones for which the exclusion file was computed).

Input Constraints

In this project no constraints were used to restrict the DUT input stimuli. This is a safe choice: there is no risk of proving coverage targets unreachable because of mistakes in the assumptions. It is also a pragmatic choice: implementing assumptions is difficult and time consuming. The drawback is that coverage targets proven to be reachable could in fact be unreachable when considering only legal input stimuli.

Top Level

The top level chosen in the VC Static script is the highest possible in the context of the project. It would be possible to do multiple runs on sub-modules to reduce complexity. Results would still be valid, but it is likely that fewer coverage targets would be proven unreachable. This is because the logic surrounding a sub-module has the same effect as input constraints. Running the FCA flow on the highest possible top level is therefore the preferred choice.

Inconsistent Coverage Options

Unfortunately, the coverage options accepted by VCS and VC Static are not yet fully consistent. VCS accepts the options line, branch, cond, fsm, tgl and assert. VC Static, on the other hand, accepts the options line, cond, tgl, fsm_state and fsm_trans.

Inconsistent Results

It is possible that coverage targets proven unreachable by VC Static could in fact be hit in a simulation test. Inconsistent results can be revealed using the following two steps. Firstly, run VC Static to generate a module-based exclusion file, i.e. without reading a coverage database. Secondly, read both the simulation coverage database and the exclusion file in Verdi, applying the option -excl_strict
($ verdi -cov -excl_strict -covdir DB_DIR -elfile proven_unreachable.el). Verdi will report any attempt to exclude coverage targets that are marked as hit in the database. These targets can also be saved into a file. A complete explanation of this phenomenon goes beyond the scope of this paper. One of the reasons why this can happen is that simulators interpret the DUT
HDL code according to simulation semantics. FV tools, on the other hand, tend to be closer to synthesis semantics, at least in their typical default settings. Targets proven to be unreachable could in fact be reached in simulation due to Xs or glitches. In other words, the coverage results reported by simulators could be over-optimistic. The VCS compilation option -cm_glitch can help reduce this problem (for more information, please refer to the relevant Synopsys documentation). However, for unclear reasons, when using this option VCS is not able to collect fsm coverage. It would also be safer to start collecting coverage information a number of cycles after reset has been released. Tests could cover the DUT initialization functionality by re-triggering reset during normal DUT operation. VCS does not have a coverage option to easily allow for this.

Review Results

It is important to reiterate that FCA cannot judge whether a coverage target is expected to be unreachable or not. Coding style and DUT parameters can result in many expected unreachable coverage targets (for a given DUT configuration). Reviewing FCA results is valuable because unexpected unreachable targets could point to DUT bugs. The engineering effort necessary to review FCA results can be reduced using several strategies. For example, strict coding guidelines could be applied to mark expected unreachable code lines, e.g. default branches of case statements. VC Static supports automatic detection of intentionally unreachable lines according to specific coding guidelines; for more information please refer to the VC Static documentation. Language features (e.g. generate statements) can also be used to reduce the number of expected unreachable targets. Whenever possible, this should be the preferred approach as it is tool independent.
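The two-step cross-check described under Inconsistent Results can also be approximated offline with a small script. The sketch below is a minimal illustration assuming simplified one-identifier-per-line text files; both file formats are hypothetical stand-ins, not the actual Verdi database or VC Static exclusion-file formats.

```python
def read_targets(path):
    """Read a set of coverage target identifiers, one per line.
    (Hypothetical simplified format; blank lines are ignored.)"""
    with open(path) as f:
        return {line.strip() for line in f if line.strip()}

def inconsistent_targets(hit_file, exclusion_file):
    # Targets proven unreachable by the formal tool yet marked as hit
    # in simulation: candidates for X-propagation or glitch investigation.
    return sorted(read_targets(hit_file) & read_targets(exclusion_file))

# usage sketch with hypothetical file names:
#   inconsistent_targets("hit_in_sim.txt", "proven_unreachable.txt")
```

Any identifier returned by such a check deserves the same scrutiny Verdi's -excl_strict report would trigger.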
3.4 Formal Functional Coverage Analysis

The previous sections have focused on the analysis of structural coverage targets. A preliminary flow applying the same process to functional coverage targets has also been developed as part of this project. Hundreds of SystemVerilog properties have been written by design and verification engineers. Cover properties directly express coverage targets. Assume and assert properties can also be processed by tools to automatically extract coverage targets. As an example, for a property expressed as A |-> B, it is sensible to ensure that the coverage target A = true can be reached by the DUT. Should the condition be unreachable, the property would never trigger in any simulation. As already pointed out, this could be expected, for example due to the specific DUT configuration under analysis, or could point to a DUT bug or a mistake in the property.

The TCL script used to drive the VC Static tool is reported below. Once again, the script is simple and reusable across projects. Comments are included to further clarify the meaning of the main commands. The main differences with respect to the previous script are: automatic generation of the exclusion file is not supported; reading of simulation coverage results is not supported.

set top my_dut

# run overnight
set_fml_var fml_max_time 12H
set_app_var fml_mode_on true

# use multiple licenses to reduce run time
# at least 5, as 4 jobs will run in parallel even when using 1 license
set_grid_usage -type LSF=<int> -control <job_submission_command>

# blackbox memories to reduce complexity
set_blackbox *RAM*
# compile DUT using the same options (e.g. defines) used for VCS
# note the use of option -sva to include properties
read_file -sva -single_step -format sverilog -top $top -vcs "-f vcs_compile_options.f"

# automatically set up clocks (at least attempt to)
infer_clock_roots
report_clock_roots -file clk_setup.sdc
read_sdc clk_setup.sdc

# compute the reset state (the starting state for formal analysis),
# but do not force reset to be inactive during formal analysis
sim_force {rsta_n} -apply 0
sim_force {rstb_n} -apply 0
sim_run 5
sim_save_reset

# review what is being black boxed
report_black_box -design

set cover_props [get_props -type cover]
set assert_props [get_props -type assert]
set assume_props [get_props -type assume]

# be safe and turn all block-level assumptions into assertions
# leave DUT inputs unconstrained
fvassert $assume_props
fvenable *

# run formal proofs
check_fv -block

# the report will mark which properties are unreachable
report_fv $cover_props -list
report_fv $assert_props -list
report_fv $assume_props -list

exit

3.5 Results

The results reported here refer to a specific FCA flow run done before final project delivery. The tool was run overnight (12 hours maximum run time) using several licenses. Load Sharing Facility (LSF) was used to distribute jobs over a Linux compute cluster. The VC Static version used was 2015.09sp1_1. The effort required to set up the initial flow, for a user with extensive formal verification experience but no previous exposure to VC Static, was 2 days. A further 5 days of effort, over the course of about 2 months, were required to unveil and address issues and improve the flow. A simulation coverage database, derived from a suite of 2000 mixed random and directed tests, was read before starting the formal analysis. VC Static analysed a total of 6045 line coverage targets and 11304 condition coverage targets. It could prove that 420 lines and 3195 conditions were
unreachable. No inconclusive proofs (i.e. the tool giving up) were reported. In other words, all other targets were proven to be reachable. It was not possible to measure the engineering effort saved by applying this flow. The impact on achieved DUT quality could not be assessed either. Nonetheless, considering the low effort necessary to implement the FCA flow for the first time, and its high reusability across projects, there is certainty of a positive return on investment (ROI). This consideration, however, is purely technical and does not take licensing costs into account.

4. Summary and Conclusions

This paper has reviewed important aspects of coverage targets and shown how FV can contribute to the coverage closure process. Implementing an FCA flow is relatively easy and requires low effort. Nonetheless, there are issues and pitfalls to be aware of, as shown in section 3.3. The results can make a valuable contribution to achieving the project's coverage goals while saving engineering effort. The process and tools presented are relatively mature, although improvements are still necessary. On a more general note, the development and adoption of coverage standards by the industry would enable smoother, more powerful and reliable FCA flows.

5. References

[1] http://www.oxforddictionaries.com/definition/english/coverage
[2] T. Blackmore, D. Halliwell, P. Barker, K. Eder and N. Ramaram, "Analysing and closing simulation coverage by automatic generation and verification of formal properties from coverage reports", Integrated Formal Methods, Lecture Notes in Computer Science, Volume 7321, pp. 84-98, 2012, Springer
[3] H. Chockler, O. Kupferman and M. Y. Vardi, "Coverage Metrics for Formal Verification", Correct Hardware Design and Verification Methods (CHARME), pp.
111-125, 2003
[4] http://www.accellera.org/activities/working-groups/ucis
[5] "Difference between create_reset and sim_force commands for reset block evaluation in VC FCA", Doc Id: 1870149, Synopsys SolvNet