Populations, Variety, and Selection: Verifying Complex Designs


  1. Populations, Variety and Selection: Verifying Complex Designs. Austin DVClub, 4/19/2011. Ken Albin, ken.albin@systemsemantics.com
  2. What is going on? Why is it that no matter how thorough we are:
     - customers still find bugs?
     - randomization finds combinations we didn't think about for our directed tests?
     - writing formal properties finds errors and ambiguities in the specification?
     Is doing the same thing more efficiently really going to get us to a higher level of quality sooner?
  3. Why this talk? Verification planning is often focused on low-level details without first considering the higher-level alternatives. Focus on ROI seems to be taking us away from methods employing variety. Complexity is the root problem we struggle with; what can we learn from complexity research?
  4. What are complex systems? It turns out that there are several definitions and models of complex systems, generally tuned to the field in which they are being used. In the book Harnessing Complexity, Axelrod and Cohen describe systems “...where the full consequences of actions may be hard – even impossible – to predict.” Their Complex Adaptive Systems model captured ideas I saw applied in successful verification planning and execution efforts.
  5. A Brief Introduction to Complex Adaptive Systems
  6. Complex Adaptive Systems (CAS): Axelrod and Cohen's CAS framework combines aspects of two familiar complex systems, biological systems and agent-based software systems. The result is a framework for reasoning about complex systems – not a magic oracle, but an aid to decision making.
  7. A simplified CAS model: a pool of agents or strategies (Agent/Strategy 1 through Agent/Strategy N), each working toward the same goal ("Done!"). As an example, let's talk about growing apples.
  8. Populations of agents or strategies: in CAS, there is a pool of different types of agents or strategies in action, each attempting to accomplish the same goal. (Diagram: apple varieties Red Delicious, Golden Delicious, Granny Smith, Gala, ..., Fuji, all leading to "Done!")
  9. Selecting the amount of variety: weighting of the different agents or strategies happens initially and is adjusted during execution.
  10. Exploitation (limiting variety): exploitation selects only the current best solution and avoids less efficient strategies.
  11. When to limit variety. Exploitation is appropriate if:
     - you are sure you have the best solution
     - the problem is not going to change
     In our example, if you have determined everyone's favorite apple and are sure there will be no changes in market preference and ideal growing conditions, then producing other types of apples would be a waste of resources.
  12. Exploration (encouraging variety): exploration tries and evaluates different strategies when the best solution is not known.
  13. When to encourage variety. Exploration is valuable for problems that:
     - are long term or widespread
     - provide fast, reliable feedback
     - have low risk of catastrophe from exploration
     - have looming disaster
     With apple trees, some amount of variety seems prudent.
  14. Exploitation vs. Exploration. The big knob to turn in a CAS is the amount of variety in the agents or strategies. It is often the case that a CAS begins in exploration mode and moves to exploitation over time, as sketched below. However, reducing variety too quickly results in premature convergence, and reducing it too slowly results in eternal boiling.
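
To make the exploration/exploitation knob concrete, here is a minimal sketch (not from the talk): verification strategies carry weights that are adjusted from feedback, and a decaying "temperature" shifts selection from exploration toward exploitation. The strategy names and the bugs-per-effort reward signal are invented for illustration.

```python
import random

# Invented strategy pool; weights are adjusted from feedback over time.
strategies = {"constrained_random": 1.0,
              "directed_tests": 1.0,
              "formal_properties": 1.0,
              "code_reviews": 1.0}

def pick(weights, temperature):
    """High temperature keeps variety (exploration); low temperature
    concentrates effort on the current best strategy (exploitation)."""
    scaled = {name: w ** (1.0 / temperature) for name, w in weights.items()}
    r = random.uniform(0, sum(scaled.values()))
    for name, s in scaled.items():
        r -= s
        if r <= 0:
            return name
    return name

temperature = 2.0                      # begin in exploration mode
for week in range(12):
    chosen = pick(strategies, temperature)
    reward = random.random()           # stand-in for bugs found per unit effort
    strategies[chosen] = 0.8 * strategies[chosen] + 0.2 * (1.0 + reward)
    # Cooling too fast risks premature convergence; never cooling
    # leaves the pool "eternally boiling".
    temperature = max(0.3, temperature * 0.9)
```
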
  15. Interaction: interaction can create superior new hybrid varieties or strategies which produce better results than the sum of the parts.
  16. Extinction: the loss of available variety may seriously impact the adaptability and robustness of a system.
  17. Example: intelligent agents in action
  18. Under the hood in formal tools: a population of engines (BDD-based fixpoint, BDD-ATPG, random simulation, SAT solver, symbolic simulation, ...) all working toward "Done!". Note: once a successful combination of engines is found through exploration, it can be exploited in "regression mode" as long as the design doesn't change much (sketched below).
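
The explore-then-exploit pattern this slide describes can be written as a small orchestration loop. The engine names come from the slide, but run_engine() and the time budget are hypothetical placeholders, not any real tool's API.

```python
ENGINES = ["bdd_fixpoint", "bdd_atpg", "random_simulation",
           "sat_solver", "symbolic_simulation"]

def run_engine(engine, prop, budget_s):
    """Placeholder: invoke the real engine on property `prop` with a time
    budget and return "proved", "falsified", or "inconclusive"."""
    raise NotImplementedError

def solve(prop, known_good=None, budget_s=60):
    # Exploitation ("regression mode"): try the engine that worked last
    # time first; this stays valid only while the design changes little.
    order = ([known_good] if known_good in ENGINES else []) + \
            [e for e in ENGINES if e != known_good]
    # Exploration: otherwise sweep the rest of the portfolio.
    for engine in order:
        result = run_engine(engine, prop, budget_s)
        if result in ("proved", "falsified"):
            return engine, result
    return None, "inconclusive"
```
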
  19. Applying the framework to verification planning and execution
  20. Checklist for applying the framework (one way to record it is sketched below):
     - Identify population
     - Identify selection criteria
     - Analyze impact of variety
       - Exploitation: best solution? stable problem?
       - Exploration: long-term problem? fast, reliable feedback? looming disaster?
     - Analyze interaction benefits
     - Analyze extinction risk
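
One way to keep such an analysis reviewable is to capture the checklist as a small data structure. This is only a sketch: the field names follow the slide, while the example entries are abbreviated illustrations rather than the talk's full analysis. Slides 24 and 27 fill in the same fields for the two worked examples.

```python
from dataclasses import dataclass

@dataclass
class CasChecklist:
    decision: str                  # the verification question being analyzed
    population: list[str]          # candidate agents or strategies
    selection_criteria: list[str]  # how success will be judged
    exploitation: str = ""         # best solution known? problem stable?
    exploration: str = ""          # long term? fast feedback? looming disaster?
    interaction: str = ""          # benefits of combining strategies
    extinction: str = ""           # what is lost if variety disappears

# Example skeleton for the question on slide 21 (entries abbreviated).
coverage_item = CasChecklist(
    decision="Is it worth achieving 100% coverage?",
    population=["coverage hole filling", "writing assertions", "code reviews"],
    selection_criteria=["bugs found", "effort spent"],
)
```
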
  21. Example: is it worth achieving 100% coverage?
  22. The last few percent. (Plot: coverage, 0% to 100%, versus time/effort.)
  23. Verification planning and execution: the population of strategies includes coverage analysis and hole filling, writing functional coverage, writing assertions, bug finding or proof of correctness, code reviews, booting a microkernel, and more, all working toward "Done!".
  24. 100% coverage analysis, applying the checklist:
     - Exploitation: not known to be the best solution, and correlating coverage to design changes over time is a problem.
     - Exploration: finding bugs is a problem throughout the product lifetime, methods that find bugs are quickly noticed, and disaster is always looming.
     - Interaction benefits: comparison of source and functional coverage, joint code/coverage reviews, and formal reachability analysis don't require 100% coverage.
     - Extinction risk: infrastructure and skills that are either not created or going stale limit adaptability.
  25. Example: should multiple simulators be used?
  26. Use of multiple simulators. Population: IFV SystemVerilog, Questa SystemVerilog, VCS SystemVerilog, symbolic simulator, Spice, ... Selection criteria: UVM support, clarity of messages, efficiency, cost. Note: this analysis is focused more on the robustness of the environment than on efficiency in finding bugs.
  27. Choice of multiple simulators, applying the checklist (the cross-checking step is sketched below):
     - Exploitation: one simulator may be more efficient for some classes of problem, but if new language features are being added (e.g., SVTB) the problem may not be stable.
     - Exploration: the feature implementations may not be resolved by the end of the project, feedback is nearly immediate, and disaster is always looming.
     - Interaction benefits: cross-checking LRM conformance, clarity of error messages, and identifying race conditions.
     - Extinction risk: alternatives are not available in case of an implementation show-stopper or a vendor switch.
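
The cross-checking interaction benefit can be sketched as a simple loop that runs each test on every simulator and flags disagreements (candidate LRM-conformance gaps or race conditions). How each simulator is actually invoked is left as a placeholder, since vendor command lines differ and are not part of the talk.

```python
def run_on(simulator, test):
    """Placeholder: launch `test` on `simulator` via its vendor-specific
    command line and return "pass" or "fail" (or a log signature)."""
    raise NotImplementedError

def cross_check(simulators, tests):
    """Return the tests on which the simulators disagree."""
    disagreements = {}
    for test in tests:
        results = {sim: run_on(sim, test) for sim in simulators}
        if len(set(results.values())) > 1:   # simulators disagree
            disagreements[test] = results
    return disagreements

# e.g. cross_check(["IFV", "Questa", "VCS"], regression_test_list)
```
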
  28. Other verification examples: emulation vs. sim farm; rerunning unit tests vs. system tests; static analysis vs. simulation; code reviews vs. writing assertions.
  29. Key ideas:
     - We need to consider exploitation vs. exploration trade-offs in verification planning and execution.
     - Variety from exploration provides more robustness and is more easily adapted to changing requirements.
     - Plan for interaction between strategies to achieve better overall results.
  30. Further info: there are many different definitions of complex systems, but the ideas for this talk were based on the definition of Complex Adaptive Systems found in Harnessing Complexity: Organizational Implications of a Scientific Frontier (2001) by Robert Axelrod and Michael D. Cohen.
