Larson assertions 081705

Transcript

  • 1. Using PSL for Assertions and Coverage at Analog Devices Kelly Larson Austin Design Center August 17, 2005
  • 2. Topics Covered Why the fuss? Uses of Assertions Dynamic property checking Functional coverage Formal Verification Other Uses Conclusion
  • 3. New and Improved!!
  • 4. Why all the interest? Assertion standards are widely accepted: PSL (formerly Sugar), SVA, OVL. Assertion languages are concise, well suited, and maintainable. Multiple vendors, multiple tools, multiple uses.
  • 5. Who writes the assertions? Block designers: low-level design assumptions, coverage points of concern. Verification team: interfaces, behavioral (is it described in the spec?), checkers, protocol. Architects: starvation, liveness, bandwidth.
  • 6. What are the uses? Dynamic property checking Functional Coverage Formal Verification Other uses
  • 7. Dynamic Property Checking The Good: works great; no detectable performance hit (so far). The Bad: integration into the environment may not be straightforward; no ability to query status from within the simulation, must post-process; debugging is hard: poor visibility, experimentation; Debussy may help, eventually. Other observations: PSL by itself is not sufficient, auxiliary code is needed.
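    As a minimal sketch of the auxiliary-code point above: PSL has no state of its own, so a small piece of HDL often models the quantity being checked. All signal names here (req, done, rst_n, outstanding) are hypothetical, and the embedded "// psl" comment style is just one common way of attaching PSL to Verilog.

        // Auxiliary Verilog: count transactions that have been requested
        // but not yet completed, since PSL cannot count by itself.
        reg [3:0] outstanding;
        always @(posedge clk)
          if (!rst_n)              outstanding <= 4'd0;
          else if (req && !done)   outstanding <= outstanding + 4'd1;
          else if (done && !req)   outstanding <= outstanding - 4'd1;

        // PSL directive that uses the auxiliary signal.
        // psl default clock = (posedge clk);
        // psl max_outstanding: assert always (rst_n -> outstanding <= 4'd8);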
  • 8. Functional Coverage One of our stated metrics for tape-out is 100% functional coverage. Functionality is mainly derived from the specification, identified in the testplan, and further refined with input from testplan reviews and code coverage analysis. PSL supports a “cover” statement in addition to “assert”. Much smarter than code coverage, though it involves much more effort.
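    A hedged sketch of the same scenario written both ways; the req/gnt handshake is hypothetical, not a block from the presentation.

        // psl default clock = (posedge clk);
        // Check: every request must be granted on the next cycle (fails on wrong behavior).
        // psl check_gnt: assert always {req} |=> {gnt};
        // Coverage: record that a request followed by a grant was actually exercised.
        // psl cover_req_gnt: cover {req; gnt};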
  • 9. What is a functional coverage check? Coverage checks have nothing to do with correct behavior of the system, only correct behavior of the tests. A coverage check may or may not have an associated property check within the same statement; the “coverage” part will be on the left-hand side of the implication operator. Coverage checks need not ever fail.
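    A sketch of such a combined statement, assuming a hypothetical FIFO interface: the coverage part (two back-to-back writes) sits on the left of the implication, the correctness condition on the right.

        // psl default clock = (posedge clk);
        // The left-hand SERE {wr; wr} is the interesting scenario; the
        // right-hand side checks behavior once that scenario occurs.
        // psl b2b_write_ok:  assert always {wr; wr} |=> {!fifo_full};
        // A pure coverage point for the same scenario, with no check attached.
        // psl b2b_write_cov: cover {wr; wr};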
  • 10. Global vs. Test Coverage Tools are great at providing overall functional coverage data. Tables show all of the assertions hit during a simulation. Can concatenate results from multiple simulation runs. Can rank tests based on how much additional coverage they provide.
  • 11. Global vs. Test Coverage (cont.) What if I want to focus on the behavior of a single test? If my test doesn’t do what I want it to do, I’d like it to fail! Current tools do not have support for using specific coverage points to affect individual simulation runs. These capabilities need to be developed by the user.
  • 12. PSL For Property Checking vs. PSL For Test Coverage PSL for property checking: relies mainly on functionality supplied through tools; will fail any time wrong behavior is detected; not tied to a specific testcase. PSL for test coverage: relies mainly on functionality customized in the environment; only fails when the coverage point is enabled for a specific testcase. Much overlap exists because both are built upon the PSL language. Conceptually, however, they are very different things.
  • 13. Adding Capability – One Approach Tools already provided ways of analyzing coverage globally, but we also wanted the ability to specify coverage points required for an individual test. Test should fail if it doesn’t accomplish what it was written to do. Needed to fit in nicely with existing methodology. Solution was to implement an additional post-processing step which was controlled by command-line options for each individual test. +RequireAssert +IgnoreAssert +ProhibitAssert
  • 14. Test Robustness Tests which are broken should always fail. This means more than simply having self-checks. Example from previous DSP project: 1. Chip supported four different bus ratios. These could be specified on the command line. If nothing was specified, a default ratio was used. 2. Self-checking directed tests were implemented; the majority used the default ratio. 3. Test behavior was verified with waveform viewers, and the tests were added to regressions. 4. Product engineering wanted to use the tests to verify silicon, but needed representation from all ratios. Command line options were changed to help balance the number of tests run in each ratio. 5. Tests still passed all self checks. Life was good? WRONG! In many cases the functionality being tested for completely went away. There was no way to translate the broken test into a failure, however, since everything still looked fine to the testcase.
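    A hedged illustration of how a required coverage point (for example, one named with the +RequireAssert mechanism from the previous slide) could have turned the broken tests above into failures. The bus_ratio encoding and xfer_valid signal are hypothetical.

        // psl default clock = (posedge clk);
        // Cover point for transfer activity in a specific non-default bus ratio.
        // If the test is required to hit this point, silently losing the
        // ratio-specific behavior makes the test fail instead of passing.
        // psl cov_ratio_1to4: cover {(bus_ratio == 2'b11) && xfer_valid};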
  • 15. Formal Verification Assertions make this path easier: dynamic assertions are used for formal analysis, and constraints added for formal analysis get verified as dynamic assertions. Misconceptions: this is done late in the project, after everything else (the earlier the better); primarily a verification tool (possibly an even better design tool).
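    A minimal sketch of the constraint-reuse idea, with a hypothetical single-outstanding-request rule on an interface: written as an assumption it constrains formal analysis, and the same property can be compiled as an assertion and checked dynamically.

        // psl default clock = (posedge clk);
        // Constraint for formal analysis: the environment never asserts
        // req on two consecutive cycles.
        // psl env_single_req:     assume always {req} |=> {!req};
        // The same property checked in simulation against the real testbench.
        // psl env_single_req_chk: assert always {req} |=> {!req};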
  • 16. Assertion Tracking Assertion tracking capability was added to the Austin environment, enabled by a “-track” option with the simulation command. All assertion data for all tests run is stored in a MySQL results database. Assertion coverage data is then browsable through a series of web CGI scripts.
  • 17. Assertion Tracking (cont.) What is it good for? To answer questions like: Do we currently have a test for “feature x”? Which one? What types of bus activity does “test x” induce? What interrupts? I’d like to test for a variety of concurrent activity; I can describe the simultaneous events in PSL, do we have a test which covers this condition? I’ve written some randomization routines, do they induce the behavior I’m expecting? I’ve finished with my PSL checks, and all my directed tests. Do I have full functional coverage on all of the PSL I wrote?
  • 18.–20. [No transcript text for these slides.]
  • 21. Test Harvesting Some tests are difficult to write because they involve trying to demonstrate interaction between signals that you do not have direct control over. Example: concurrent transactions from different system busses into a bus arbiter. The conditions to test for can usually be easily described with PSL. Randomization can then be used to stimulate the device, the PSL coverage statements can be tracked, and tests with the desired behavior can be “harvested” from random test runs as directed tests. More efficient than writing by hand. If a test breaks due to a design change, the same procedure can be repeated to “harvest” another testcase.
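    A sketch of the kind of condition that can be described in PSL and then harvested from random runs; the two bus request signals and the arbiter-busy signal are hypothetical.

        // psl default clock = (posedge clk);
        // Cover concurrent requests from two system buses while the
        // arbiter is already servicing a transaction.
        // psl cov_concurrent_req: cover {mbus_req && pbus_req && arb_busy};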
  • 22. Other Uses Speedpath testing: simple scripts can be used to generate assertions which “fire” when certain speedpaths are hit. Once this is instrumented, the entire testsuite can be run to extract the best tests for hitting the desired speedpaths. Clock analysis: again, simple Perl scripts generated assertions for firing when clocks in different areas of the chip were active. This enabled a detailed verification and analysis of different low power modes where clocks were dynamically enabled and disabled.
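    A speculative sketch of the kind of cover directives such a script might emit; the speedpath endpoints and clock-enable names are invented for illustration.

        // psl default clock = (posedge clk);
        // Fires when a particular speedpath's launch and capture conditions line up.
        // psl cov_speedpath_017: cover {launch_reg_toggle && capture_reg_enable};
        // Fires when the DSP core clock is active, for low-power mode analysis.
        // psl cov_clk_dsp_core:  cover {clk_en_dsp_core};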
  • 23. Enhanced Visibility Coverage assertions can be thought of as a way to verify waveforms in batch mode. PSL can be an efficient way to communicate testing needs between design and verification engineers. PSL can be reviewed more easily during a final testplan review than the test code itself.
  • 24. Our Future Goals & Strategy All new checkers will be written in PSL. Easier to maintain. Easier to reuse. Allows leverage into formal verification effort. Provides base for tying tests to assertions for coverage. With very few exceptions, all future directed tests will require at least one assertion for coverage to determine the pass/fail of the test. Testplans reference assertions, and the assertion strategy, in addition to the directed tests and random test strategy.
  • 25. Summary PSL and other assertion methods are crucial because they allow the same efforts to be leveraged in multiple ways. While many tools and methodologies are still fairly new, the value added is worth the effort today.
  • 26. More Information Using PSL/Sugar for Formal and Dynamic Verification, Ben Cohen, Srinivasan Venkataramanan, and Ajeetha Kumari. Using PSL for Functional Coverage Webinar, Kelly Larson, http://www.cadence.com/company/events/webinars.aspx Accellera PSL 1.1 LRM (Language Reference Manual), http://www.accellera.org/ IEEE P1850, http://www.eda.org/ieee-1850/ Cadence ABV documents, under the docs/ directory of the local Cadence tools (IUS) installation: writeabv.pdf - Simulation-Based Assertion Writing Tutorial; abvguide.pdf - Simulation-Based Assertion Checking Guide; abvtutorial.pdf - Simulation-Based Assertion Checking Tutorial
  • 27. Presented By: Kelly Larson Austin Design Center Analog Devices, Inc. 6500 Riverplace Blvd. Bldg IV, Suite 300 Austin, TX 78730 PHONE 512-427-1094 kelly.larson@analog.com