Class 1

1. CES 592 Telecommunications System Product Verification
   Sonoma State University
   • Class Lecture 1: Introduction
   Fall 2004
2. Instructors
   • France Antelme
   • Ario Bigattini
   • Jaseem Masood
   • Steven Woody
   • Coordinator: Prof. Ravi Kumar
   • Website:
3. General
   • All lecture material will be downloadable from the website the day after the lecture.
   • Instructors will be available for the half hour before each week's lecture, unless other arrangements are made in advance by e-mail.
   • The class mark will be made up of three projects, each counting 20%, and a final project counting 40% of the overall mark.
   • Course guidelines are posted on the website.
   • This is a new course, and there may thus be some changes from the original course curriculum as the semester progresses.
4. Telecom Product Verification Introduction
   • Overall High-Level Flow Chart
   • Elements of Product Verification
     • Hardware Verification
     • Software Verification
5. Telecom Product Verification Introduction
   • This course focuses on the following:
   • Verification of a new Telecom product prior to release from Engineering Development to Production. We do not consider Production Testing to be a component of Product Verification.
   • "Black Box" testing techniques, with some application of "Grey Box" testing techniques
   • A practical approach to testing techniques and methodologies
6. Telecom Product Verification Testing Techniques
   • Black Box Testing:
   • Also known as functional testing
   • A software testing technique whereby the internal workings of the item being tested are not known to the tester. For example, in a black box test on a software design, the tester only knows the inputs and what the expected outcomes should be, not how the program arrives at those outputs. The tester never examines the programming code and needs no knowledge of the program beyond its specifications.
   • Reference: http://www.webopedia.com/TERM/B/Black_Box_Testing.html
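To make the black box idea concrete, here is a minimal sketch in Python (illustrative only). The frame_checksum() routine is a made-up stand-in for an item under test; the point is that the test cases are derived purely from specified inputs and expected outputs, never from how the routine is written.

    # Black-box sketch: tests use only specified inputs and expected outputs.
    import unittest

    def frame_checksum(frame: bytes) -> int:
        """Stand-in implementation; a black-box tester would never read this body."""
        if not frame:
            raise ValueError("empty frame")
        return sum(frame) & 0xFF

    class BlackBoxChecksumTest(unittest.TestCase):
        def test_known_vector(self):
            # Input and expected output come straight from the (hypothetical) specification.
            self.assertEqual(frame_checksum(b"\x01\x02\x03"), 0x06)

        def test_rejects_empty_frame(self):
            # The spec declares an empty frame invalid; only the observable outcome is checked.
            with self.assertRaises(ValueError):
                frame_checksum(b"")

    if __name__ == "__main__":
        unittest.main()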
7. Black Box Testing
   [Diagram: the Equipment Under Test (EUT) receives external stimuli and produces outputs]
   • Physical layer: Input signal, Optical, Line
   • Operating environment: Vibration, Temperature
   • Additional stresses: Environmental
   • Software functionality: Logical layer, Control layer, Management layer
   The entire EUT (the "Black Box") is subjected to stimuli, not individual circuit cards within the unit.
8. Telecom Product Verification Testing Techniques
   • White Box Testing: Also known as glass box, structural, clear box, and open box testing.
   • A software testing technique whereby explicit knowledge of the internal workings of the item being tested is used to select the test data. Unlike black box testing, white box testing uses specific knowledge of the programming code to examine outputs. The test is accurate only if the tester knows what the program is supposed to do; he or she can then see if the program diverges from its intended goal. White box testing does not account for errors caused by omission, and all visible code must also be readable.
   • Reference: http://www.webopedia.com/TERM/B/Black_Box_Testing.html
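For contrast, a minimal white-box sketch of the same hypothetical frame_checksum() routine: here the test data is chosen from knowledge of the code's internal branches (the empty-frame guard and the 8-bit wrap-around), so that every branch is exercised.

    # White-box sketch: test data is selected from knowledge of the code's branches.
    import unittest

    def frame_checksum(frame: bytes) -> int:
        if not frame:                 # branch 1: guard clause for an empty frame
            raise ValueError("empty frame")
        return sum(frame) & 0xFF      # branch 2: normal path, result masked to 8 bits

    class WhiteBoxChecksumTest(unittest.TestCase):
        def test_guard_clause_branch(self):
            with self.assertRaises(ValueError):
                frame_checksum(b"")

        def test_wraparound_path(self):
            # Chosen because the code masks with 0xFF: 0xFF + 0x02 wraps to 0x01.
            self.assertEqual(frame_checksum(b"\xff\x02"), 0x01)

    if __name__ == "__main__":
        unittest.main()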
9. Telecom Product Verification Testing Techniques
   • Grey Box Testing:
   • A software testing technique that combines the Black Box and White Box approaches. The tester has partial knowledge of the internal workings of the item being tested. Limited stimuli are applied to those internal workings, while the remaining test focus follows the Black Box approach: applying inputs to the item under test and observing the outputs.
10. The Product Development Cycle
   [Flow chart: a New Product Idea, shaped by customer & market-driven inputs and by Product Line Management & Engineering inputs, leads to the System Spec, which splits into the Software Spec and Hardware Spec; these drive Software Development / SW Unit Test and Hardware Development / HW Unit Test, which converge in HW-SW Integration and then Product Verification, ending in Release to Manufacture. The chart distinguishes Engineering Development functions from Product Verification functions.]
11. Product Verification Phase
   [Flow chart of the formal Product Verification phase: the HW-SW Integration Test feeds Software Verification, HW Standards/Requirements Testing, HW Stress Testing, and HW Compliance & Agency Approvals, leading to Release to Production and then Volume Production.]
12. Telecom Product Verification Introduction
   • Overall High-Level Flow Chart
   • Elements of Product Verification
     • Hardware Verification
     • Software Verification
13. Telecom Product Verification Introduction
   • We will also apply the terms "Black Box" and "White Box" testing loosely to Hardware Verification, i.e., the entire equipment under test is subjected to external stimuli and the resulting outputs are verified against the product requirements and standards.
14. Telecom Product Verification Introduction
   • Hardware Verification: Unit Test Phase
   • First prototypes are typically tested by the HW development Engineering team.
   • The hardware feature is tested at the sub-system level to ascertain the performance of all subsystems against the design specification, in areas such as transmitters, receivers, PLLs, busses, microprocessors, and memory.
   • Analogous to the Software "White Box" testing approach
   • Usually execute a subset of the formal test plan and, in addition, execute a series of sub-system-level design tests to ensure that all critical subsystems are functioning within the design specification and design parameters.
   • Successful completion of this test phase, as measured against the exit criteria for this phase, results in entry into the formal test phase.
15. Telecom Product Verification Introduction
   • Hardware Verification: Formal Test Phase
   • The Equipment Under Test (EUT) is verified against the Product, Industry, and Regulatory requirements.
   • The hardware EUT is tested at the platform or system level, i.e., all constituent hardware features ("cards" or "circuit packs") are integrated into a functional system chassis.
   • Analogous to the Software "Black Box" testing approach
   • Verify that the entire system (i.e., router, switch, etc.) functions as per the design specifications.
   • Usually execute the full set of tests required to ensure functionality according to the hardware feature specification.
   • Successful completion of this test phase, as measured against the exit criteria for this phase, results in release of the hardware platform to Production.
16. Elements of Hardware Verification
   [Diagram] Hardware Verification breaks down into:
   • Standards-based Testing: Physical Layer, Logical Layer
   • Compliance & Agency Approvals: EMC, Safety, NEBS, Telecom
   • Stress Testing (HALT/HASS): Design stress testing; Accelerated life-cycle testing (beyond normal operating limits): where does it break?
17. Telecom Product Verification Introduction
   • Hardware Verification: Standards-based testing. The Equipment Under Test (EUT) is verified by the company's own test organization (i.e., self-certified) against the Product and Industry requirements for performance of the following:
   • Physical layer
     • Transmitters
       • Eye pattern, pulse mask
       • Line frequency
       • Line build-out
       • Laser line width, side-mode suppression, etc.
       • Tx power output
       • Jitter generation and jitter tolerance
     • Receivers
       • Rx sensitivity, dispersion/reflection power penalty
     • Synchronization systems (where applicable)
       • Wander (<10 Hz) & jitter (>10 Hz) performance: MTIE/TDEV
       • Interface jitter/wander (optical and line)
   • Logical and higher layers:
     • Framing
     • Alarming
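As one small illustration of the synchronization measurements listed above, the sketch below computes MTIE (Maximum Time Interval Error) from a series of time interval error (TIE) samples: for each observation window the peak-to-peak TIE is taken, and MTIE is the worst case over all windows. The sample values and window length are invented for the example.

    # Minimal MTIE computation over a list of TIE samples (illustrative values).
    from typing import Sequence

    def mtie(tie_samples: Sequence[float], window: int) -> float:
        """Worst peak-to-peak TIE over every sliding window of `window` samples."""
        if window < 1 or window > len(tie_samples):
            raise ValueError("window must be between 1 and the number of samples")
        worst = 0.0
        for start in range(len(tie_samples) - window + 1):
            chunk = tie_samples[start:start + window]
            worst = max(worst, max(chunk) - min(chunk))
        return worst

    # Example: TIE in nanoseconds sampled once per second, MTIE over a 3 s window.
    print(mtie([0.0, 1.5, -0.5, 2.0, 1.0], window=3))   # prints 2.5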
18. Telecom Product Verification Introduction
   • Hardware Verification: Compliance Testing. The Equipment Under Test (EUT) is verified against the Industry and Regulatory requirements and certified compliant by independent laboratories:
   • EMI/EMS/ESD:
     • EM radiated & conducted emissions
     • EM radiated & conducted susceptibility
     • Electrostatic discharge tolerance
   • Safety: e.g., Underwriters Laboratories, etc.
   • Telecom interface: i.e., performance certification for inclusion in the public network
   • Power and grounding: verification of the power and grounding layout against regulatory requirements
   • Environmental: coarse industry environmental requirements for operation/storage under high/low temperature, airborne contaminants, shock and vibration, flame resistance, heat generation, etc.
19. Telecom Product Verification Introduction
   • Hardware Verification: Stress Testing. The Equipment Under Test (EUT) is subjected to external stresses to excite failures.
     • HALT: Highly Accelerated Lifecycle Tests: the EUT is subjected to stress to determine failure modes.
     • HASS: Highly Accelerated Stress Screen: the EUT is subjected to elevated stresses in order to ensure a high yield in production.
     • The point of this type of test activity is to use elevated stresses to excite failure modes of very low probability of occurrence in a short period of time. The failure mode is remedied, and the remedy is used to drive higher yields in production and lower customer failure rates.
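As a loose illustration of the step-stress idea behind HALT, the sketch below generates a simple upward temperature step-stress profile: the stress is stepped well beyond normal operating limits, with a dwell and a functional test at each step, until a failure is excited or the chamber limit is reached. The start point, step size, limit, and dwell time are arbitrary example values, not figures from any standard or from this course.

    # Hypothetical HALT-style temperature step-stress profile (example values only).
    def halt_temperature_steps(start_c: float = 20.0,
                               step_c: float = 10.0,
                               chamber_limit_c: float = 120.0,
                               dwell_min: int = 10):
        """Yield (temperature, dwell) pairs for an upward step-stress run."""
        temp = start_c
        while temp <= chamber_limit_c:
            yield temp, dwell_min
            temp += step_c

    for temp, dwell in halt_temperature_steps():
        print(f"Dwell {dwell} min at {temp:.0f} C, then functionally test the EUT")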
20. Telecom Product Verification Introduction
   • Overall High-Level Flow Chart
   • Elements of Product Verification
     • Hardware Verification
     • Software Verification
21. Telecom Product Verification Introduction
   • Software Verification Test Phases Defined:
   • Plan the test strategy
   • Design the test cases
   • Execute the test cases
   • Report test results
   • Testing schedules are always a balance between quality and time-to-market.
22. Telecom Product Verification Introduction
   • Test Execution Phases Defined
   • Unit Testing
   • Integration Testing
   • Feature Testing
   • System Testing
23. Telecom Product Verification Introduction
   • Software Verification: Unit Test Phase
   • First actual testing performed
   • Can be performed by the developer on their own code
   • Tests the new feature in isolation from other features
   • White box testing approach
   • Check for memory leaks and resource depletion
   • Fault insertion testing
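A minimal unit-test sketch of the isolated, white-box testing described above, including a simple fault-insertion case. The alarm_severity() feature and its read_register() dependency are hypothetical, invented for illustration; the fault is inserted by patching the dependency so that it fails.

    # Unit-test sketch with fault insertion via unittest.mock (hypothetical feature).
    import unittest
    from unittest import mock

    def read_register(addr: int) -> int:
        """Stand-in for a low-level hardware read used by the feature."""
        return 0

    def alarm_severity(addr: int) -> str:
        """Hypothetical feature: map a status register value to an alarm severity."""
        try:
            value = read_register(addr)
        except IOError:
            return "CRITICAL"      # losing the register read itself is critical
        return "MAJOR" if value & 0x01 else "CLEAR"

    class AlarmSeverityUnitTest(unittest.TestCase):
        def test_clear_when_status_bit_not_set(self):
            with mock.patch(__name__ + ".read_register", return_value=0x00):
                self.assertEqual(alarm_severity(0x10), "CLEAR")

        def test_fault_insertion_read_failure(self):
            # Fault insertion: force the dependency to fail and check the handling.
            with mock.patch(__name__ + ".read_register", side_effect=IOError):
                self.assertEqual(alarm_severity(0x10), "CRITICAL")

    if __name__ == "__main__":
        unittest.main()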
24. Telecom Product Verification Introduction
   • Test Execution Phases Defined
   • Unit Testing
   • Integration Testing
   • Feature Testing
   • System Testing
25. Telecom Product Verification Introduction
   • Software Verification: Integration Test Phase
   • Should be performed by an independent tester
   • Tests major functionality of the new feature immediately after the feature's code branch has been collapsed into the main branch
   • Should verify end-to-end functionality as soon as possible
   • Should be a quick and frequent test: an automated daily/weekly "sanity" or "smoke" test
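A minimal sketch of an automated daily "smoke" test, assuming a hypothetical lab node name and management port; a sanity check of this kind only asks whether the freshly integrated build is alive and reachable, leaving deeper functional checks to the feature and system test phases.

    # Daily smoke-test sketch (host name and port are hypothetical placeholders).
    import socket
    import unittest

    LAB_NODE = "lab-node-1.example.com"   # hypothetical lab chassis under test
    MGMT_PORT = 22                        # assumed management interface (SSH)

    class DailySmokeTest(unittest.TestCase):
        def test_management_port_reachable(self):
            # A smoke test asks "is it alive?", not "is every feature correct?".
            with socket.create_connection((LAB_NODE, MGMT_PORT), timeout=5) as conn:
                self.assertIsNotNone(conn.getpeername())

    if __name__ == "__main__":
        unittest.main()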
26. Telecom Product Verification Introduction
   • Test Execution Phases Defined
   • Unit Testing
   • Integration Testing
   • Feature Testing
   • System Testing
27. Telecom Product Verification Introduction
   • Software Verification: Feature Test Phase
   • Verifies the new feature meets all requirements
   • Positive and negative test cases
   • Performed with a structured test plan, and also with free-form exploratory testing
   • Manual and automated testing
   • Black box testing approach
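A small sketch of data-driven positive and negative test cases against a single feature requirement. The requirement (VLAN IDs 1 through 4094 are valid) and the validate_vlan_id() helper are assumptions made up for illustration.

    # Positive and negative test cases for one hypothetical feature requirement.
    import unittest

    def validate_vlan_id(vlan_id: int) -> bool:
        """Hypothetical requirement: valid VLAN IDs are 1 through 4094 inclusive."""
        return isinstance(vlan_id, int) and 1 <= vlan_id <= 4094

    class VlanFeatureTest(unittest.TestCase):
        def test_positive_cases(self):
            for vid in (1, 100, 4094):           # values the requirement says must work
                with self.subTest(vid=vid):
                    self.assertTrue(validate_vlan_id(vid))

        def test_negative_cases(self):
            for vid in (0, 4095, -1, 99999):     # values the requirement says must be rejected
                with self.subTest(vid=vid):
                    self.assertFalse(validate_vlan_id(vid))

    if __name__ == "__main__":
        unittest.main()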
28. Telecom Product Verification Introduction
   • Test Execution Phases Defined
   • Unit Testing
   • Integration Testing
   • Feature Testing
   • System Testing
29. Telecom Product Verification Introduction
   • Software Verification: System Test Phase
   • Performed in a customer environment
   • Use-case testing to perform typical customer operations
   • Large-scale "scalability" testing (multiple nodes)
   • Interoperability testing with other vendor equipment
   • "Availability" stress testing (max traffic throughput, simultaneous traffic types, bit errors, alarm storms, ...)
   • Security vulnerability testing
   • Performed with a structured test plan, and also with free-form exploratory testing
30. Telecom Product Verification Introduction
   • Test Execution Phases Defined
   • Unit Testing
   • Integration Testing
   • Feature Testing
   • System Testing
   • More test phases?
   • Field Testing / Beta Testing
   • Testing User Documents
   • Regression Testing
   • Acceptance Testing
   • Code review / walkthrough / inspection (bug prevention vs. bug detection, or "Quality Assurance" vs. "Testing")
31. Telecom Product Verification Introduction
   Software testing – project management considerations
32. Telecom Product Verification Introduction
   • Project Management Considerations:
   • Test Execution
   • Defect tracking
   • Bug scrubs and prioritization
   • Loadbuild frequency
   • Code coverage analysis
   • Location for test results
   • Metrics for test execution
   • Test phase entry and exit criteria
   • Test case execution time vs. test failure investigation time
   • Test plan / test case / test script updates
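As a small example of the metrics and entry/exit-criteria items above, the sketch below checks a test phase's status against a hypothetical exit criterion; the 95%-executed, 90%-pass, zero-critical-defect thresholds and the record layout are illustrative, not figures from this course.

    # Hypothetical test-phase exit-criteria check (all thresholds are examples).
    from dataclasses import dataclass

    @dataclass
    class PhaseStatus:
        planned: int
        executed: int
        passed: int
        open_critical_defects: int

    def meets_exit_criteria(s: PhaseStatus,
                            min_executed: float = 0.95,
                            min_pass_rate: float = 0.90) -> bool:
        executed_ratio = s.executed / s.planned if s.planned else 0.0
        pass_rate = s.passed / s.executed if s.executed else 0.0
        return (executed_ratio >= min_executed
                and pass_rate >= min_pass_rate
                and s.open_critical_defects == 0)

    # Example: 190 of 200 cases run, 180 passed, one critical defect still open.
    print(meets_exit_criteria(PhaseStatus(200, 190, 180, 1)))   # prints False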
33. Telecom Product Verification Introduction
   • Project Management Considerations
   • There is overlap in the test execution roles, but at some point the software must be tested by a software tester, not the software developer.
   • Software Development vs. Software Testing
   • Functional perspective ("Does it work?") vs. vulnerability perspective ("Can it be broken?")
   • The job of the software tester is to want the software to fail, expect it to fail, and to find its failures. (adapted from Kaner, Falk & Nguyen)
34. Telecom Product Verification Introduction
   • Project Management Considerations
   • Software testing can be performed in parallel (concurrently) with software development, or it can be performed serially after software development.
   • The best approach is a combination of both: testing starts when coding starts, and more testing people are added when code design is complete. Automated tests are built as the software is built. (More about test automation later.)
   • The result is quicker detection and correction of bugs, for this release and future releases. The goal is always to find bugs earlier in the development process.
35. Telecom Product Verification Introduction
   • Project Management Considerations
   • When making a trade-off decision between software testing depth vs. testing breadth, remember it is better to have some information about all features rather than no information about some features.
   • "One percent of the bugs in Microsoft's software cause half of all reported errors with 20 percent of bugs responsible for 80 percent of the mistakes," Microsoft Chief Executive Steve Ballmer, Oct 3, 2002
36. Telecom Product Verification Introduction
   • Project Management Considerations
   • Which is better: to verify that all features work as intended (positive test cases), or to verify all failure scenarios (negative test cases)?
   • There are always more tests that could be performed.
   • Constantly re-evaluate your test plans to make sure you are doing the right tests.
37. Telecom Product Verification Introduction
   • Project Management Considerations
   • No amount of testing can prove that the software is bug-free. Software testing can only prove that bugs exist.
   • Q&A
38. Telecom Product Verification Introduction
   • Resources
   • Testing Computer Software, C. Kaner, J. Falk, and H. Nguyen
   • Black-Box Testing: Techniques for Functional Testing of Software and Systems, Boris Beizer, Wiley, 1995
   • Managing the Testing Process, Rex Black
   • Classic Testing Mistakes, Brian Marick, http://www.testing.com/writings/classic/mistakes.pdf
   • Software QA / Test Resource Center, http://www.softwareqatest.com/index.html
