Sidharth Kumar

  1. Towards Target-Level Testing and Debugging Tools for Embedded Software
     Harry Koehnemann, Arizona State University
     Dr. Timothy Lindquist, Arizona State University
     Presented by: Sidharth Kumar, CSCI 589, October 9th, 2007
  2. Summary of Paper
     • Introduction
     • Software Testing
     • Embedded Testing
     • Problems with Embedded Testing
     • Increasing Target Functionality
     • Architectural Additions
     • Run-Time System Additions
     • Strengths of the Paper
     • Weaknesses of the Paper
     • Conclusions
  3. The Race Begins…
  4. [Figure-only slide]
  5. Introduction
     • Software validation and testing
     • Goal: to identify and address the problems associated with test-case execution for embedded systems
     • To propose solutions for making embedded testing more effective at revealing errors
  6. Introduction – Cont.
     • Testing: executing a piece of software in order to reveal errors; involves test-case development, generation, and execution
     • Debugging: locating and correcting the cause of an error once it has been revealed; the developer must recreate the exact execution scenario, and all environmental variants must be accounted for
  7. Introduction – Cont.: Embedded Systems
     • Constraints:
       – Concurrent designs
       – Real-time constraints
       – Embedded target environments
       – Distributed hardware architectures
       – Device control dependencies
     • Restriction of execution visibility and control
     • Target environment shortcomings
  8. Software Testing
     • Module-level testing
     • Integration testing
     • System testing
     • Hardware/software integration testing
     The fourth phase is unique to embedded systems.
     "If you can't make it good, at least make it look good." – Bill Gates
  9. Software Testing – Cont.
     Testing concurrent systems:
     • Concurrency increases the difficulty of software testing
     • A concurrent program has an unmanageably large set of legal execution sequences
     • Subsequent executions with the same input may yield different, yet correct, results (see the sketch after this slide)
     Non-intrusive testing:
     • Embedded applications have strict timing requirements
     • No intrusions on a test execution
     • Instrumentation
     • Real-time embedded tools: ROM monitors, emulators, bus monitors
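The non-determinism point can be made concrete with a small host-side example. This sketch is illustrative only and not from the paper; it assumes a POSIX host with pthreads rather than an embedded target. Two tasks append their identifier to a shared log; every run produces a legal ordering, yet repeated runs with identical input may produce different orderings, which is exactly what makes reproducing a failing test so hard.

```c
/* Illustrative only (not from the paper): two tasks log their id under a mutex,
 * so every interleaving is legal, but the order observed differs between runs. */
#include <pthread.h>
#include <stdio.h>

static char log_buf[16];
static int  log_len = 0;
static pthread_mutex_t log_lock = PTHREAD_MUTEX_INITIALIZER;

static void *task(void *arg)
{
    char id = *(const char *)arg;
    for (int i = 0; i < 4; i++) {
        pthread_mutex_lock(&log_lock);   /* no data race, but ordering across tasks is unspecified */
        log_buf[log_len++] = id;
        pthread_mutex_unlock(&log_lock);
    }
    return NULL;
}

int main(void)
{
    pthread_t a, b;
    char ida = 'A', idb = 'B';
    pthread_create(&a, NULL, task, &ida);
    pthread_create(&b, NULL, task, &idb);
    pthread_join(a, NULL);
    pthread_join(b, NULL);
    log_buf[log_len] = '\0';
    printf("%s\n", log_buf);   /* e.g. "AABBABAB" on one run, "ABABABAB" on another */
    return 0;
}
```

Both outputs are correct with respect to the same input, so a test oracle must accept a set of orderings rather than one expected trace.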
  10. Software Testing – Cont.: Impact of the Underlying System
     • Dealing with abstraction
     • Task objects enable high-level communication
     • With abstraction, implementation details become buried (a hedged sketch follows this slide)
     [Figure: language constructs, hardware, time, complexity]
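To illustrate how abstraction buries the details a target-level tool would need, here is a hypothetical sketch. The rts_task_t type and rts_send function are invented for this illustration; they are not CIFO, MRTSI, or any real RTS API. The caller performs one abstract operation, while the bookkeeping that a tester might want to observe stays hidden behind it.

```c
/* Hypothetical sketch (all names invented): the call site sees a single
 * high-level send; the queueing and readiness updates are buried inside it. */
#include <stdint.h>
#include <stdio.h>
#include <string.h>

typedef struct {
    char    name[8];
    uint8_t mailbox[32];
    int     pending;        /* hidden state: bytes queued for the task       */
    int     ready;          /* hidden state: would drive a scheduler decision */
} rts_task_t;

static int rts_send(rts_task_t *dest, const void *msg, size_t len)
{
    if (len > sizeof dest->mailbox) return -1;
    memcpy(dest->mailbox, msg, len);   /* hidden: copy into the task object   */
    dest->pending = (int)len;          /* hidden: mark the message pending    */
    dest->ready   = 1;                 /* hidden: task becomes runnable       */
    return 0;
}

int main(void)
{
    rts_task_t logger = { .name = "logger" };
    int16_t celsius = 37;
    rts_send(&logger, &celsius, sizeof celsius);   /* the only visible operation */
    printf("%s: ready=%d pending=%d bytes\n", logger.name, logger.ready, logger.pending);
    return 0;
}
```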
  11. Embedded Testing
     • Developed on custom hardware configurations
     • Tools and techniques for one configuration are not applicable to another
     • Ad hoc approach to integration and system testing
     • Two environments: host and target; the target has little support for software development tools
  12. Embedded Testing – Cont.
     Current state of embedded testing:
     • Incorrect handling of interrupts
     • Distributed communication problems
     • Incorrect ordering of concurrent events
     • Resource contention
     • Incorrect use of device protocols and timing
     • Incorrect response to failures or transients
     As space-system software grows in size and complexity, adequate testing becomes more difficult—and more critical.
  13. Embedded Testing – Cont.: Current Solutions
     Hardware solutions:
     • Attempt to gain execution visibility and program control
     • Bus monitors, ROM monitors, in-circuit emulators
     • Minimal effectiveness for software development; gather information only on low-level machine data
     Software solutions:
     • Attempt to reduce the cost of testing on the target
     • Factors: level of criticality, test platform availability, test classification, etc.
     • Lead to extensive modifications and hence extensive retesting
  14. Problems with Embedded Testing
     • Expense of the testing process
     • Level of functionality on the target
     • Late discovery of errors
     • Poor test selection criteria
     • Potential use in advancing architectures
  15. Increasing Target Functionality
     • Tool support for embedded systems is lacking
     • Current approaches will soon be obsolete for future architectures
     • Add facilities to the underlying system to better support testing and debugging tools
     • Underlying system: hardware architecture and run-time system (RTS)
  16. Increasing Target Functionality – Cont.: Model Debugging System
     • Symbol table information
     • At least two processes executing: the test program and the debugger
     • To realize non-intrusion:
       – Execute debugger code only at a breakpoint
       – Run the debugger as a separate process
       – Provide a separate execution unit to execute the debugger
     (A host-side analogy follows this slide.)
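The paper's internal debugger runs on the target itself. As a rough host-side analogy only, assuming a Linux host rather than the embedded setting the paper targets, the sketch below shows a debugger that is a separate process: it attaches to the test program, sleeps until the program stops, inspects it, and then gets out of the way.

```c
/* Host-side analogy only, not the paper's target-level design: a separate
 * debugger process attaches with ptrace, blocks in waitpid until the tracee
 * stops, reads one word of its memory, then detaches (which resumes it).
 * Assumes Linux; pid and address are supplied by the user. */
#include <stdio.h>
#include <stdlib.h>
#include <sys/ptrace.h>
#include <sys/types.h>
#include <sys/wait.h>

int main(int argc, char **argv)
{
    if (argc != 3) {
        fprintf(stderr, "usage: %s <pid> <hex-address>\n", argv[0]);
        return 1;
    }
    pid_t pid = (pid_t)strtol(argv[1], NULL, 10);
    unsigned long addr = strtoul(argv[2], NULL, 16);

    if (ptrace(PTRACE_ATTACH, pid, NULL, NULL) == -1) {
        perror("PTRACE_ATTACH");
        return 1;
    }
    int status;
    waitpid(pid, &status, 0);                 /* debugger stays idle until the target stops */

    long word = ptrace(PTRACE_PEEKDATA, pid, (void *)addr, NULL);
    printf("word at 0x%lx = 0x%lx\n", addr, (unsigned long)word);

    ptrace(PTRACE_DETACH, pid, NULL, NULL);   /* detaching lets the target run again */
    return 0;
}
```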
  17. Architectural Additions
     • Hardware partitioning of memory: each process should have its own protected address space, not accessible to any other process (shared-memory concept)
     • Computational facilities for the debugger: the internal debugger on the target processor either runs as a regular process or is given dedicated architectural support for its execution
     • Hardware breakpoints: software breakpoints are intrusive and require more computation; hardware breakpoint registers (BPRs) are checked by the processor against the operands of each instruction (a hypothetical sketch follows this slide)
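A hypothetical illustration of the breakpoint-register idea: the register layout, the BPR_BASE address, and the field names below are all invented for this sketch and do not correspond to any real processor or to the paper's exact proposal. The debugger writes an address and an enable bit; the hardware compares the programmed value against executing instructions and raises a debug exception on a match, so no code patching is needed.

```c
/* Purely hypothetical memory-mapped breakpoint register block; addresses,
 * fields, and names are invented for illustration only. */
#include <stdint.h>

#define BPR_BASE    0x40001000u              /* invented MMIO base address     */
#define BPR_ENABLE  (1u << 0)                /* invented control bit           */

typedef struct {
    volatile uint32_t addr;                  /* address for the comparator     */
    volatile uint32_t ctrl;                  /* enable / match-kind bits       */
} bpr_t;

static inline void set_hw_breakpoint(unsigned slot, uint32_t code_addr)
{
    bpr_t *bpr = (bpr_t *)(uintptr_t)(BPR_BASE + slot * sizeof(bpr_t));
    bpr->addr = code_addr;                   /* value the hardware compares against */
    bpr->ctrl = BPR_ENABLE;                  /* hardware now traps on a match, with
                                                no instrumentation of the code itself */
}
```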
  18. Architectural Additions – Cont.
     • Architectural support for abstractions: abstractions should be incorporated into the instruction set; the architecture must be the basis for emulation capabilities and for providing execution visibility
     • Dedicated bus: an interface that allows the processor to communicate with the external world without interfering with the system under test
  19. Run-Time System Additions
     Architectural additions are costly in time and space.
     • RTS requirements describe an interface between a tool and the underlying system: a logical interface requiring substantial hardware support
     • The goal is to minimize the data and computational requirements of the internal debugger, as well as the required communication between the internal and external debuggers
     • MRTSI and CIFO implement the abstractions
  20. Run-Time System Additions – Cont.
     RTS additions are minimal.
     The CIFO/MRTSI provide abstractions for:
     • Processes
     • Interrupt management
     • Time management
     • Memory management
     • Exception/fault handling
     (A hypothetical C rendering of such an interface follows this slide.)
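CIFO and MRTSI are Ada-oriented interface sets, and the paper works in an Ada RTS context. Purely as an illustration, the C declarations below sketch the kind of hooks such an interface might expose to an internal debugger across the five areas listed above. Every name here is invented; this is not the actual CIFO or MRTSI API.

```c
/* Hypothetical C rendering (all names invented) of a run-time-system interface
 * that gives a debugger controlled access to processes, interrupts, time,
 * memory, and faults without exposing RTS internals. */
#include <stddef.h>
#include <stdint.h>

typedef uint32_t rts_pid_t;
typedef enum { RTS_READY, RTS_RUNNING, RTS_BLOCKED, RTS_SUSPENDED } rts_state_t;

/* Process management */
int  rts_dbg_list_processes(rts_pid_t *out, size_t max, size_t *count);
int  rts_dbg_suspend(rts_pid_t pid);
int  rts_dbg_resume(rts_pid_t pid);
int  rts_dbg_get_state(rts_pid_t pid, rts_state_t *state);

/* Interrupt management */
int  rts_dbg_mask_interrupt(unsigned irq);
int  rts_dbg_unmask_interrupt(unsigned irq);

/* Time management */
uint64_t rts_dbg_now_ticks(void);
int      rts_dbg_freeze_clock(int freeze);   /* stop logical time for the system under test */

/* Memory management */
int  rts_dbg_read_mem(rts_pid_t pid, uintptr_t addr, void *buf, size_t len);
int  rts_dbg_write_mem(rts_pid_t pid, uintptr_t addr, const void *buf, size_t len);

/* Exception / fault handling */
typedef void (*rts_fault_hook_t)(rts_pid_t pid, int fault_code);
int  rts_dbg_set_fault_hook(rts_fault_hook_t hook);
```

Keeping the interface this narrow is what lets the internal debugger stay small in data and computation, with the external debugger doing the heavy lifting on the host.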
  21. Conclusions
     • RTS and architectural additions propose a solution to the problems in embedded-system testing
     • Testing and debugging of embedded, real-time software remains a "black art", relying on ad hoc methods and techniques
     • Evaluate all additions and determine their feasibility
  22. Strengths of the Paper
     • Focus on the need for detecting errors early
     • Improvement in architectural capabilities for better performance of embedded software
     • Emphasis on improving testing efficiency in the target environment
  23. Weaknesses of the Paper
     • Safety-critical aspects of the target environment are not addressed
     • The additions improve testing, but to what extent?
     • When should target-level testing stop?
  24. From the Author's Desk…
     • No, we did not address anything specific to safety-critical systems. Our goal was to open a real-time OS (RTOS) (really, back then, an Ada RTS) to give testing and debugging tools control of the execution. What those tests were was irrelevant to us. Safety-critical analysis would produce interesting test scenarios that would require some target-level control to obtain the right system states, and might require something like we proposed in our paper.
     • Regarding b) The architecture additions came about from work I was doing at Intel at the time. In practice, none of those ideas ever became reality.
     • Regarding c) I think we are talking about different types of systems. The embedded systems we describe are targeted for devices with little external visibility into their execution - embedded controllers, cell phones, etc. Traditional desktop software does not have these problems. Target for us means an embedded board with limited debugging capabilities. FYI, since that time, the embedded community has made great strides. The RTOS vendors have consolidated to 2 - Green Hills/VxWorks - and both provide very good target-level debugging capabilities.
     – Harry
  25. References
     • http://csse.usc.edu/classes/cs589_2007/
     • http://www.aero.org/publications/crosslink/fall2005/06.html
     • http://www.embedded.com/2000/0011/0011feat1.htm
     • http://www.ece.cmu.edu/~koopman/des_s99/sw_testing/presentation.pdf
     • www.google.com
  26. The Race Never Ends…
