Software Testing and Reliability: Testing Real-Time Systems

Transcript

  • 1. Software Testing and Reliability: Testing Real-Time Systems. Aditya P. Mathur, Purdue University. May 19-23, 2003, at Guidant Corporation, Minneapolis/St. Paul, MN. Last update: April 17, 2003. Graduate Assistants: Ramkumar Natarajan, Baskar Sridharan
  • 2. Learning objectives
    • Issues in testing real-time systems
    • Tools for testing real-time systems
    • Methodology for testing real-time systems
  • 3. References
    • Testing Embedded Systems: Do You Have the GuTs for It?, Vincent Encontre, Rational, June 2001, http://www.rational.com/products/whitepapers/final_encontre_testing.jsp
    • CodeTest Vision for Embedded Software, Applied Microsystems Corporation, http://www.amc.com/products/codetest/
    • G-Cover Object Code Analyzer, Green Hills Software Inc., http://www.ghs.com/products/safety_critical/gcover.html
    • DO-178B, http://www.rtca.org and http://www.windriver.com/products/html/do178b.html
  • 4. The RTCA DO-178B Guidelines
    • The guidelines in DO-178B impose constraints on the software development process so that the resulting system is safe.
    • DO-178B, published by RTCA and recognized by the FAA, offers guidelines for the development of software for airborne systems and equipment.
    • Most RTOS tool vendors have accepted the guidelines in DO-178B and begun to offer tool support.
  • 5. Levels of Criticality: A, B
    • Level A: Software with anomalous behavior, as shown by the system safety assessment process, would cause or contribute to a failure of system function resulting in a catastrophic failure condition for the aircraft.
    • Level B: Software with anomalous behavior, as shown by the system safety assessment process, would cause or contribute to a failure of system function resulting in a hazardous/severe-major failure condition for the aircraft.
    Guidant products: Level A or B?
  • 6. Levels of Criticality: C, D, E
    • Level C: Software with anomalous behavior, as shown by the system safety assessment process, would cause or contribute to a failure of system function resulting in a major failure condition for the aircraft.
    • Level D: Software with anomalous behavior, as shown by the system safety assessment process, would cause or contribute to a failure of system function resulting in a minor failure condition for the aircraft.
    • Level E: Software with anomalous behavior, as shown by the system safety assessment process, would cause or contribute to a failure of system function with no effect on aircraft operational capability or pilot workload.
  • 7. Guidelines Related to Testing
    • Test Procedures are correct
    • Compliance with Software Verification Plan
    • Results are correct
    • Test coverage of High-Level requirements
    • Test coverage of Low-Level requirements
    • Test coverage of object code
    • Test coverage - Statement, Decision, and Modified Condition/Decision (MC/DC); these criteria are contrasted in the sketch after this list
    • Test coverage - Data coupling and control coupling
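As a concrete illustration of the structural coverage criteria named above, here is a minimal C sketch built around a hypothetical two-condition decision; the function name and conditions are illustrative and not taken from the slides. The comments list the test cases each criterion demands.

```c
#include <stdbool.h>
#include <stdio.h>

/* Hypothetical guard with one decision made of two conditions. */
static bool should_pace(bool rate_low, bool sensing_ok) {
    if (rate_low && sensing_ok)   /* decision under test */
        return true;
    return false;
}

int main(void) {
    /* Statement coverage: (T,T) plus any false case, e.g. (F,F),
     * executes every statement.
     * Decision coverage: the decision must evaluate to both true and
     * false; the same two tests suffice.
     * MC/DC: each condition must be shown to independently affect the
     * decision's outcome:
     *   (T,T) vs (F,T) isolates rate_low,
     *   (T,T) vs (T,F) isolates sensing_ok.
     * A minimal MC/DC set here: (T,T), (F,T), (T,F). */
    printf("%d %d %d\n",
           should_pace(true, true),
           should_pace(false, true),
           should_pace(true, false));
    return 0;
}
```

For reference, DO-178B ties these criteria to the criticality levels above: Level A requires MC/DC, Level B decision coverage, and Level C statement coverage.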
  • 8. Issues in Testing Real-Time Systems
    • Separation of development and execution platforms
    • Variety of execution platforms (combinations of hardware and OS, e.g. Z80/QNX, Tornado/PowerPC)
    • Tight resources and timing constraints
    • Emerging quality and certification standards
  • 9. Test Methodology: Unit Testing
    • Identify the module to be tested. This module becomes the MuT (Module under Test).
    • Prepare MuT for testing.
    • Isolate the MuT from its environment.
    • Make the MuT an independently executable module by adding a stub and a test driver; a sketch of such a harness follows.
    Test harness (from the slide's diagram): the test driver generates tests and exercises the MuT, while the stub replaces the rest of the application.
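A minimal sketch of such a harness in C, assuming a hypothetical MuT compute_dose() that depends on a read_sensor() routine normally supplied by the rest of the application; all names are illustrative, not from the slides.

```c
#include <assert.h>
#include <stdio.h>

/* --- MuT (normally compiled from its own source file) --- */
int read_sensor(void);               /* normally provided by the application */
int compute_dose(int setpoint) {
    int reading = read_sensor();
    return setpoint > reading ? setpoint - reading : 0;
}

/* --- Stub: replaces the rest of the application --- */
static int stub_sensor_value;
int read_sensor(void) { return stub_sensor_value; }

/* --- Test driver: generates tests and observes returned values --- */
int main(void) {
    stub_sensor_value = 40;
    assert(compute_dose(100) == 60);  /* setpoint above reading */
    stub_sensor_value = 100;
    assert(compute_dose(50) == 0);    /* setpoint at or below reading */
    printf("MuT unit tests passed\n");
    return 0;
}
```

Because the stub lets the driver control the MuT's environment, such tests can run on the development host before the target hardware is available.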
  • 10. Issues in Unit Testing
    • How to generate tests?
    • What to observe inside MuT?
    • Returned parameters
    • Internal state such as values of global variables, coverage, control flow for verification against a UML sequence diagram, etc.
    • How to observe?
    • Use coverage tools or debuggers
    • Insert probes (a sketch of a simple probe follows this list)
    • When to stop testing?
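A minimal sketch of an inserted probe in C; the PROBE macro, the trace buffer, and the clamp() function are illustrative and not part of any tool named in these slides.

```c
#include <stdio.h>

#define TRACE_MAX 256
static unsigned trace_buf[TRACE_MAX];
static unsigned trace_len;

/* Record that control reached probe point `id`. */
#define PROBE(id) \
    do { if (trace_len < TRACE_MAX) trace_buf[trace_len++] = (id); } while (0)

/* MuT instrumented with probes at points of interest. */
static int clamp(int x, int lo, int hi) {
    PROBE(1);                              /* function entry */
    if (x < lo) { PROBE(2); return lo; }
    if (x > hi) { PROBE(3); return hi; }
    PROBE(4);
    return x;
}

int main(void) {
    clamp(-5, 0, 10);
    clamp(7, 0, 10);
    /* The recorded trace (1 2 1 4) can be compared against the control
     * flow expected from the design, e.g. a UML sequence diagram, and
     * probe points never hit reveal coverage gaps. */
    for (unsigned i = 0; i < trace_len; i++)
        printf("%u ", trace_buf[i]);
    printf("\n");
    return 0;
}
```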
  • 11. Test Methodology: Integration Testing
    • Combine multiple MuTs into a larger software component.
    • Build a test harness.
    • Look for the correctness of interactions amongst the various MuTs in the component. Again, UML sequence diagrams can be useful in validating the interactions against the design; a sketch of such a check follows.
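A minimal sketch in C of checking the interaction order between two integrated modules against the sequence expected from the design; the module names, the call log, and the expected sequence are illustrative.

```c
#include <assert.h>
#include <stdio.h>
#include <string.h>

/* Call log shared by the instrumented modules. */
static const char *calls[8];
static int ncalls;
static void log_call(const char *name) { if (ncalls < 8) calls[ncalls++] = name; }

/* Two previously unit-tested modules, now integrated into one component. */
static void sensor_read(void)    { log_call("sensor_read"); }
static void filter_update(void)  { log_call("filter_update"); }
static void component_step(void) { sensor_read(); filter_update(); }

int main(void) {
    /* Interaction order expected from the design (sequence diagram). */
    const char *expected[] = { "sensor_read", "filter_update" };
    component_step();
    assert(ncalls == 2);
    for (int i = 0; i < ncalls; i++)
        assert(strcmp(calls[i], expected[i]) == 0);
    printf("interaction order matches the design\n");
    return 0;
}
```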
  • 12. Test Methodology: System (Unit) Testing
    • The MuT is now a complete software application.
    • Tests are executed under the RTOS.
    • Communication amongst individual modules might be via the RTOS.
    • The test driver could now be a simulator. It brings the system to a desired initial state and then generates and sends a sequence of test messages to the system (as sketched after this list).
    • Observation is done using probes. The data so collected is analyzed on the host platform.
    • Usually one does grey-box testing at this stage, but more could be done.
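A minimal sketch in C of such a simulator-style driver; msg_t, the system_* interfaces, and the scripted message values are hypothetical placeholders standing in for the real system, so the sketch runs on a host.

```c
#include <assert.h>
#include <stddef.h>
#include <stdio.h>

typedef struct { int id; int value; } msg_t;

/* Placeholder implementations standing in for the real embedded system,
 * so this sketch compiles and runs on the development host. */
static int state;                                        /* observed via a probe  */
static void system_reset(void)          { state = 0; }   /* desired initial state */
static void system_send(const msg_t *m) { (void)m; state++; }
static int  system_state(void)          { return state; }

/* Scripted message sequence generated by the driver. */
static const msg_t script[] = {
    { .id = 1, .value = 60 },   /* e.g. set a rate limit */
    { .id = 2, .value = 1  },   /* e.g. enable sensing   */
};

int main(void) {
    system_reset();                                      /* known initial state  */
    for (size_t i = 0; i < sizeof script / sizeof script[0]; i++)
        system_send(&script[i]);                         /* replay the script    */
    assert(system_state() == 2);                         /* check observed state */
    printf("system test script passed\n");
    return 0;
}
```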
  • 13. Test Methodology: System Integration Testing
    • Integrate other software components of the end-application. These might be off-the-shelf components.
    • Test the integrated system.
  • 14. Test Methodology: System Validation Testing
    • For the overall embedded system, perform the following:
    • Check if the system meets the end-user functional requirements.
    • Perform non-functional testing. This includes load testing, robustness testing, and performance testing.
    • Check for conformance with any inter-operability requirements.
  • 15. Run-time Analysis. WindRiver and Agilent offer real-time trace tools that work with Agilent’s logic analyzers. (Diagram: source → compiler → assembler/linker → executable → target platform; RTA server → GUI → analysis tools.)
  • 16. Applicability of Test Techniques to Real-Time Systems
    • All techniques for test data generation and coverage analysis covered in this course are applicable when testing real-time applications.
    • Due to the embedded nature of such applications, one generally needs to use special tools that operate in a client-server mode, with the server running on the host machine and the client in the target environment.
    • The client provides data gathered during the execution of the embedded application, and the server analyzes and presents it to the tester; a sketch of this split follows.
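A minimal sketch in C of that client-server split, assuming the target-side client packs probe hits into a length-prefixed byte stream and the host-side server decodes and reports them; the in-memory channel stands in for whatever transport (serial, JTAG, Ethernet) a real tool would use, and all names are illustrative.

```c
#include <stdint.h>
#include <stdio.h>
#include <string.h>

/* --- Target-side client: encode probe hits as <count><id...> --- */
static size_t client_pack(const uint8_t *hits, size_t n,
                          uint8_t *out, size_t cap) {
    if (n + 1 > cap) return 0;
    out[0] = (uint8_t)n;
    memcpy(out + 1, hits, n);
    return n + 1;
}

/* --- Host-side server: decode and present the data to the tester --- */
static void server_report(const uint8_t *buf, size_t len) {
    if (len == 0) return;
    printf("probe hits received: ");
    for (size_t i = 0; i < buf[0] && i + 1 < len; i++)
        printf("%u ", buf[i + 1]);
    printf("\n");
}

int main(void) {
    uint8_t hits[] = { 1, 2, 1, 4 };          /* e.g. from an inserted probe */
    uint8_t channel[64];                      /* stand-in for the transport  */
    size_t len = client_pack(hits, sizeof hits, channel, sizeof channel);
    server_report(channel, len);              /* would run on the host       */
    return 0;
}
```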
  • 17. Tools for Testing Real Time Systems
    • Rational: Test RealTime: code coverage; execution sequence; unit testing; integration testing; checking assertions for C++ classes; testing C threads, tasks, and processes; evaluating the UML model; requirement-to-test traceability; plug-in for MathWorks Simulink.
    • Applied Microsystems: CodeTest: a suite of several tools; performance measurement; tracing of execution history; memory leak detection; code coverage measurement; supports QNX.
    • Green Hills Software: G-Cover: coverage analysis at the object code level; indirect source code coverage information.
    • WindRiver: RTA: detects memory leaks; identifies hot spots; per-task profiling.
  • 18. Summary
    • Issues in testing real-time systems
    • Test tool architecture
    • Test methodology
