  1. 1. Towards Service-Oriented Testing of Web Services Hong Zhu Department of Computing, Oxford Brookes University Oxford OX33 1HX, UK Yufeng Zhang Dept of Computer Sci, National Univ. of Defense Tech., Changsha, China, Email: [email_address]
  2. 2. Overview <ul><li>Motivation </li></ul><ul><ul><li>The impact of WS on software testing </li></ul></ul><ul><ul><li>Requirements for supporting the testing of WS </li></ul></ul><ul><li>Proposed framework </li></ul><ul><li>Prototype implementation </li></ul><ul><li>Case studies </li></ul><ul><li>Conclusion </li></ul>
  3. 3. Characteristics of Web Services <ul><li>The components (services) of WS applications are: </li></ul><ul><ul><li>Autonomous : they control their own resources and their own behaviours </li></ul></ul><ul><ul><li>Active : execution is not merely triggered by incoming messages </li></ul></ul><ul><ul><li>Persistent : computational entities that last a long time </li></ul></ul><ul><li>Interactions between services exhibit: </li></ul><ul><ul><li>Social ability : services discover each other and establish interactions at runtime </li></ul></ul><ul><ul><li>Collaboration : as opposed to being controlled, a service may refuse a request, follow a complicated protocol, etc. </li></ul></ul>
  4. 4. WS technology stack <ul><li>Basic standards: </li></ul><ul><ul><li>WSDL: service description and publication </li></ul></ul><ul><ul><li>UDDI: service registration and retrieval </li></ul></ul><ul><ul><li>SOAP: service invocation and delivery </li></ul></ul><ul><li>More advanced standards for collaboration between service providers and requesters: </li></ul><ul><ul><li>BPEL4WS: business process and workflow models </li></ul></ul><ul><ul><li>OWL-S: ontology for describing the semantics of services </li></ul></ul>(Diagram: the provider registers its service with the registry; the requester searches the registry for registered services, then requests the service from the provider, who delivers it.)
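To make the role of SOAP in this stack concrete, here is a minimal sketch of constructing a SOAP 1.1 request envelope. The `GetQuote` operation and its parameters are hypothetical; a real invocation would also set the operation's target namespace from the service's WSDL.

```python
import xml.etree.ElementTree as ET

# SOAP 1.1 envelope namespace
SOAP_ENV = "http://schemas.xmlsoap.org/soap/envelope/"

def build_soap_request(operation, params):
    """Build a SOAP envelope invoking `operation` with the given parameters."""
    ET.register_namespace("soap", SOAP_ENV)
    envelope = ET.Element(f"{{{SOAP_ENV}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_ENV}}}Body")
    op = ET.SubElement(body, operation)  # service-defined namespace omitted for brevity
    for name, value in params.items():
        child = ET.SubElement(op, name)
        child.text = str(value)
    return ET.tostring(envelope, encoding="unicode")

# A hypothetical insurance-quote invocation:
xml_request = build_soap_request("GetQuote", {"carModel": "Mini", "driverAge": 30})
```

The resulting XML is what actually travels over HTTP in a SOAP call, which is also why testing must cope with the XML encoding of all data passed between services.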
  5. 5. Testing developer’s own services <ul><li>Testing a service is similar to testing a software component </li></ul><ul><ul><li>Much existing work on software component testing can be applied or adapted </li></ul></ul><ul><li>Special considerations are required for: </li></ul><ul><ul><li>The stateless nature of the HTTP protocol; </li></ul></ul><ul><ul><li>The XML encoding of data passed between services, as in the SOAP standard; </li></ul></ul><ul><ul><li>Conformance to the published descriptions: </li></ul></ul><ul><ul><ul><li>WSDL for the syntax of the services </li></ul></ul></ul><ul><ul><ul><li>workflow specifications in BPEL4WS </li></ul></ul></ul><ul><ul><ul><li>semantic specifications in, e.g., OWL-S </li></ul></ul></ul>
  6. 6. Testing developer’s own services (continued) <ul><li>Dealing with requesters’ abnormal behaviours </li></ul><ul><ul><li>Requesters are autonomous, so their behaviour may be unexpected </li></ul></ul><ul><ul><li>We need to ensure that the service handles abnormal behaviours properly </li></ul></ul><ul><li>Dealing with unexpected usages/loads </li></ul><ul><ul><li>As with all web-based applications, load balancing is essential </li></ul></ul><ul><ul><li>The usage profile of a WS may not be available during the design and implementation of the system </li></ul></ul><ul><li>Dealing with incomplete systems </li></ul><ul><ul><li>A service may rely on other services, making it hard to separate the testing of one’s own services from integration testing, especially when complicated workflows are involved </li></ul></ul><ul><ul><li>In the worst case, the other services are only bound dynamically </li></ul></ul>
  7. 7. Testing of others’ services in composition <ul><li>Some similarity to component integration testing </li></ul><ul><li>However, the differences are dominant </li></ul><ul><li>Problems in applying existing integration testing techniques: </li></ul><ul><ul><li>Lack of software artifacts </li></ul></ul><ul><ul><li>Lack of control over test executions </li></ul></ul><ul><ul><li>Lack of means of observing system behaviour </li></ul></ul>
  8. 8. Lack of software artifacts <ul><li>The problem: </li></ul><ul><li>No design documents, no source code, no executable code </li></ul><ul><li>The impacts: </li></ul><ul><ul><li>For statically bound services, </li></ul></ul><ul><ul><ul><li>techniques that automatically derive stubs from source code are not applicable </li></ul></ul></ul><ul><ul><ul><li>automatic instrumentation of the original source code or executable code is not applicable </li></ul></ul></ul><ul><ul><li>For dynamically bound services, </li></ul></ul><ul><ul><ul><li>human involvement in the integration becomes impossible </li></ul></ul></ul><ul><li>Possible solutions: </li></ul><ul><ul><li>(a) Derive test harnesses from WS descriptions; </li></ul></ul><ul><ul><li>(b) The service provider makes test stubs and drivers available for integration. </li></ul></ul>
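Solution (a) can be sketched as deriving stubs from the published description alone. Here the WSDL is abstracted to a plain mapping from operation names to their output parts and XSD types (a deliberate simplification; real WSDL processing would parse `portType` and `message` elements), and each derived stub returns canned, type-appropriate values. The operation names are hypothetical.

```python
# Hypothetical, simplified view of a WSDL: operation name -> output parts and XSD types.
WSDL_OPERATIONS = {
    "GetQuote": {"premium": "xsd:double", "currency": "xsd:string"},
    "CheckCover": {"covered": "xsd:boolean"},
}

# Canned value for each XSD primitive type the stub may need to fabricate.
CANNED = {"xsd:double": 0.0, "xsd:string": "stub", "xsd:boolean": False, "xsd:int": 0}

def make_stub(operation, wsdl=WSDL_OPERATIONS):
    """Derive a test stub for `operation` that returns canned, type-correct outputs."""
    outputs = wsdl[operation]
    def stub(**_request):  # accepts any request and ignores its content
        return {part: CANNED[xsd_type] for part, xsd_type in outputs.items()}
    return stub

quote_stub = make_stub("GetQuote")
```

The stub is syntactically conformant to the description but carries no business logic, which is exactly why provider-supplied stubs (solution (b)) remain the stronger option.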
  9. 9. Lack of control over test executions <ul><li>Problem: </li></ul><ul><ul><li>Services are typically located on a computer on the Internet over whose execution testers have no control. </li></ul></ul><ul><li>Impacts: </li></ul><ul><ul><li>An invocation of the service as a test must be distinguished from a real request for the service. </li></ul></ul><ul><ul><li>The system may need to be restarted or put into a certain state before it can be tested. </li></ul></ul><ul><ul><li>The situation becomes much more complicated when a WS is tested by many service requesters simultaneously. </li></ul></ul><ul><li>Possible solution: </li></ul><ul><ul><li>The service provider must provide a mechanism and a service that enable service requesters to control the test executions of the service. </li></ul></ul>Currently, there is no support for such mechanisms in the W3C WS standards.
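One way the provider could distinguish test invocations from real requests is per-session sandboxing. The sketch below is an assumption, not an existing mechanism: the `X-Test-Session` header is hypothetical (as the slide notes, no W3C WS standard defines such a thing), and requests carrying it are routed to isolated state that can also be reset or preset by the tester.

```python
# Sketch of provider-side support for test invocations. The "X-Test-Session"
# header is a hypothetical convention; no WS standard defines it.
class QuoteService:
    def __init__(self):
        self.live_quotes = []   # real business state
        self.sandboxes = {}     # per-test-session isolated state

    def handle(self, request, headers):
        session = headers.get("X-Test-Session")
        if session is not None:
            # Test invocation: use (creating on demand) an isolated sandbox,
            # so testing never touches the live state.
            state = self.sandboxes.setdefault(session, [])
        else:
            state = self.live_quotes
        state.append(request)
        return {"quoteId": len(state)}

svc = QuoteService()
svc.handle({"car": "Mini"}, {"X-Test-Session": "t1"})  # test request -> sandbox "t1"
svc.handle({"car": "Golf"}, {})                        # real request -> live state
```

Because each tester gets its own sandbox key, many requesters can test the same WS simultaneously without interfering with each other or with live traffic.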
  10. 10. Lack of means of observation <ul><li>The problem: </li></ul><ul><ul><li>A tester cannot observe the internal behaviour of the services </li></ul></ul><ul><li>The impacts: </li></ul><ul><ul><li>No way to measure test coverage </li></ul></ul><ul><ul><li>No way to ensure the internal state is correct </li></ul></ul><ul><li>Possible solutions: </li></ul><ul><ul><li>The service provider provides a mechanism and services for outside testers to observe its software’s internal behaviour, so that the test adequacy a service requester requires can be achieved. </li></ul></ul><ul><ul><li>The service provider opens its documents, source code and other software artifacts necessary for testing to trusted test service providers. </li></ul></ul>
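The first solution can be sketched as the provider instrumenting its own operations and exposing an observation operation that reports coverage. Everything below is a minimal illustration under assumed names; real adequacy measurement would track finer-grained items (branches, states) than operation counts.

```python
# Sketch of a provider-side observation service: the provider instruments its
# own operations and exposes a coverage summary that requesters can query.
class ObservableService:
    def __init__(self, operations):
        self._ops = operations
        self._calls = {name: 0 for name in operations}  # invocation counters

    def invoke(self, name, *args):
        """F-service entry point, instrumented to record each invocation."""
        self._calls[name] += 1
        return self._ops[name](*args)

    def coverage_report(self):
        """T-service operation: fraction of operations exercised so far."""
        exercised = sum(1 for n in self._calls.values() if n > 0)
        return {"operation_coverage": exercised / len(self._calls),
                "calls": dict(self._calls)}

# Hypothetical insurer operations, reduced to trivial lambdas:
svc = ObservableService({"quote": lambda car: 100.0, "cancel": lambda qid: True})
svc.invoke("quote", "Mini")
report = svc.coverage_report()
```

A requester can then keep generating tests until the reported adequacy meets its own threshold, without ever seeing the provider's source code.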
  11. 11. The proposed approach <ul><li>A WS should be accompanied by a testing service. </li></ul><ul><ul><li>functional services : the services providing the original functionality </li></ul></ul><ul><ul><li>testing services : the services that enable testing of the functional services </li></ul></ul><ul><li>Testing services can be provided either by the vendor of the functional services or by a third party. </li></ul>
  12. 12. Architecture of service oriented testing (Diagram: a Broker comprising a Matchmaker, a UDDI Registry, a GUI and ontology management mediates between the F-services and T-services of applications A1 and A2 and the F-services and T-services of testers T1 and T2.)
  13. 13. A Typical Scenario: Car Insurance Broker (Diagram: the car insurance broker CIB’s service requester interacts through a GUI interface with CIB’s F-services and T-services, which are connected via the WS registry and a test broker’s F-services and T-services to the F-services and T-services of bank B, insurers A1 … An, and testers T1 and T2.)
  14. 14. How does it work? <ul><li>Suppose the car insurance broker wants to search for insurers’ web services and test them before making quotes for its customers. </li></ul>(Diagram: the customer sends information about the car and the user to the Car Insurance Broker CIB, which returns insurance quotes; the testing concerns the integration of CIB with the Insurer Web Service IS.)
  15. 15. (Sequence diagram: Car Insurance Broker CIB, Insurer Web Service IS (F-service and T-service), Test Broker TB, Matchmaker, Testing Service TG (test case generator) and Testing Service TE (test executor). Steps: 0. intended composition of services; 1-2. register services; 3. request test service; 4. search for testers; 5. list of testers; 6. request test service; 7. request service meta-data; 8. testing-related meta-data; 9. test case; 10. request test service; 11. request service meta-data; 12. testing-related meta-data; 13. test invocation of services; 14. results of test invocations; 15. test results; 16. test report.)
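The brokering scenario on this slide can be sketched with every party reduced to a plain Python object. All class and method names below are hypothetical stand-ins for the actual WS invocations; the point is only the control flow: register, search, generate, execute, report.

```python
# Every party in the scenario reduced to a plain object; names are hypothetical.
class Matchmaker:
    def __init__(self):
        self.registry = {}

    def register(self, name, service):       # register services
        self.registry[name] = service

    def search(self, activity):              # search for testers matching an activity
        return [s for s in self.registry.values() if s.activity == activity]

class TestService:
    """A registered T-service: an advertised activity plus an invokable body."""
    def __init__(self, activity, run):
        self.activity, self.run = activity, run

def broker_test(matchmaker, service_under_test):
    """The test broker's workflow: find testers, generate cases, execute, report."""
    generator = matchmaker.search("test case generation")[0]
    executor = matchmaker.search("test execution")[0]
    cases = generator.run(service_under_test)                         # test cases
    results = [executor.run((service_under_test, c)) for c in cases]  # test invocations
    return {"passed": all(results), "n": len(results)}                # test report

mm = Matchmaker()
mm.register("TG", TestService("test case generation", lambda svc: [1, 2, 3]))
mm.register("TE", TestService("test execution", lambda job: True))
report = broker_test(mm, object())
```

In the real framework each of these calls is a SOAP invocation mediated by the registry, and the matchmaking step uses the ontology-based capability matching described later, not a string comparison.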
  16. 16. Key Issues to Automate Test Services <ul><li>How a testing service should be described, published and registered at a WS registry; </li></ul><ul><li>How a testing service can be searched for and retrieved automatically, even for testing dynamically bound services; </li></ul><ul><li>How a testing service can be invoked both by a human tester and by a program that dynamically discovers a service and tests it before binding to it; </li></ul><ul><li>How testing results can be summarized and reported in forms suitable both for human beings to read and for machines to understand. </li></ul>These issues can be resolved by utilizing a software testing ontology (Zhu & Huo 2003, 2005).
  17. 17. STOWS: Software Testing Ontology for WS <ul><li>An ontology defines the basic terms and relations comprising the vocabulary of a topic area, as well as the rules for combining them to define extensions to the vocabulary </li></ul><ul><li>STOWS is based on an ontology of software testing originally developed for agent-oriented software testing (Zhu & Huo 2003, 2005). </li></ul><ul><ul><li>The concepts of software testing are divided into two groups: basic and compound. </li></ul></ul><ul><ul><li>Knowledge about software testing is also represented as relations between concepts </li></ul></ul>
  18. 18. STOWS (1): Basic concepts <ul><li>Tester : a particular party who carries out a testing activity. </li></ul><ul><li>Activity : the actions performed in the testing process, including test planning , test case generation , test execution , result validation , adequacy measurement and test report generation , etc. </li></ul><ul><li>Artefact : the files, data, program code, documents, etc. involved in testing activities. An Artefact possesses an attribute Location expressed by a URL or URI. </li></ul><ul><li>Method : the method used to perform a test activity . Test methods can be classified in a number of different ways. </li></ul><ul><li>Context : the context in which testing activities occur, i.e. the software development stage and the testing purpose. Testing contexts typically include unit testing, integration testing, system testing, regression testing, etc. </li></ul><ul><li>Environment : the hardware and software configurations in which testing is to be performed. </li></ul>
  19. 19. STOWS (2): Compound concepts <ul><li>Capability : describes what a tester can do </li></ul><ul><li>the activities that a tester can perform </li></ul><ul><li>the context in which to perform the activity </li></ul><ul><li>the testing method used </li></ul><ul><li>the environment in which to perform the testing </li></ul><ul><li>the required resources (i.e. the input) </li></ul><ul><li>the output that the tester can generate </li></ul>(UML diagram: Capability is associated with Activity, Context, Method and Environment, and with Capability Data of type Artefact classified as Input or Output by the Capability Data Type enumeration.)
  20. 20. <ul><li>Task : describes what testing service is requested </li></ul><ul><li>the testing activity to be performed </li></ul><ul><li>how the activity is to be performed: </li></ul><ul><ul><li>the context </li></ul></ul><ul><ul><li>the testing method to be used </li></ul></ul><ul><ul><li>the environment in which the activity must be carried out </li></ul></ul><ul><ul><li>the available resources </li></ul></ul><ul><ul><li>the expected outcomes </li></ul></ul>(UML diagram: Task is associated with Activity, Context, Method and Environment, and with Task Data of type Artefact classified as Input or Output by the Task Data Type enumeration.)
  21. 21. STOWS (3): Relations between concepts <ul><li>Relationships between concepts are a very important part of the knowledge of software testing: </li></ul><ul><ul><li>Subsumption relation between testing methods </li></ul></ul><ul><ul><li>Compatibility between artefacts’ formats </li></ul></ul><ul><ul><li>Enhancement relation between environments </li></ul></ul><ul><ul><li>Inclusion relation between test activities </li></ul></ul><ul><ul><li>Temporal ordering between test activities </li></ul></ul><ul><li>How such knowledge is used: </li></ul><ul><ul><li>Instances of basic relations are stored in a knowledge-base as basic facts </li></ul></ul><ul><ul><li>Used by the testing broker to search for test services through compound relations </li></ul></ul>
  22. 22. Compound relations <ul><li>MorePowerful relation: between two capabilities. </li></ul><ul><ul><li>MorePowerful(c1, c2) means that a tester with capability c1 can do all the tasks that can be done by a tester with capability c2. </li></ul></ul><ul><li>Contains relation: between two tasks. </li></ul><ul><ul><li>Contains(t1, t2) means that accomplishing task t1 implies accomplishing t2. </li></ul></ul><ul><li>Matches relation: between a capability and a task. </li></ul><ul><ul><li>Matches(c, t) means that a tester with capability c can fulfil the task t. </li></ul></ul>
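A minimal sketch of the Matches relation over simplified Capability and Task records follows. It is an assumption-laden reduction: real STOWS matching would also consult the knowledge base's subsumption, compatibility and enhancement relations rather than exact string equality, and the field values below ("test case generation", "algebraic specification", etc.) are illustrative.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Capability:
    activity: str
    context: Optional[str] = None
    method: Optional[str] = None
    inputs: frozenset = frozenset()    # resources the tester requires
    outputs: frozenset = frozenset()   # artefacts the tester can produce

@dataclass
class Task:
    activity: str
    context: Optional[str] = None
    method: Optional[str] = None
    inputs: frozenset = frozenset()    # resources the requester can supply
    outputs: frozenset = frozenset()   # artefacts the requester expects

def matches(c: Capability, t: Task) -> bool:
    """Matches(c, t): a tester with capability c can fulfil task t.
    Simplified to exact activity/context/method agreement, plus:
    the tester's required inputs must be available, and the task's
    expected outputs must be producible."""
    if c.activity != t.activity:
        return False
    if t.context is not None and c.context != t.context:
        return False
    if t.method is not None and c.method != t.method:
        return False
    return c.inputs <= t.inputs and t.outputs <= c.outputs

c = Capability("test case generation", method="algebraic specification",
               inputs=frozenset({"algebraic spec"}), outputs=frozenset({"test cases"}))
t = Task("test case generation", method="algebraic specification",
         inputs=frozenset({"algebraic spec", "WSDL"}), outputs=frozenset({"test cases"}))
```

Under the same reduction, MorePowerful(c1, c2) would hold when every task matched by c2 is also matched by c1, which is what makes these compound relations derivable from the basic facts in the knowledge base.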
  23. 23. Prototype Implementation <ul><li>STOWS is represented in OWL-S </li></ul><ul><ul><li>Basic concepts as XML data definitions </li></ul></ul><ul><ul><li>Compound concepts defined as service profiles </li></ul></ul><ul><li>UDDI/OWL-S registry server (as the test broker): </li></ul><ul><ul><li>Using the OWL-S/UDDI Matchmaker </li></ul></ul><ul><ul><li>The environment: </li></ul></ul><ul><ul><ul><li>Windows XP, </li></ul></ul></ul><ul><ul><ul><li>Intel Core Duo CPU 2.16GHz, </li></ul></ul></ul><ul><ul><ul><li>JDK 1.5, Tomcat 5.5 and MySQL 5.0. </li></ul></ul></ul>
  24. 24. Case Study <ul><li>The automated software testing tool CASCAT is wrapped into a test service </li></ul><ul><ul><li>Registered: </li></ul></ul><ul><ul><ul><li>its capability is described in the ontology, represented in OWL-S </li></ul></ul></ul><ul><ul><li>Searchable: </li></ul></ul><ul><ul><ul><li>it can be found when a testing task matches its capability </li></ul></ul></ul><ul><ul><li>Invokable through the Internet: </li></ul></ul><ul><ul><ul><li>as a web service generating test cases based on algebraic specifications </li></ul></ul></ul><ul><li>A web service and its corresponding test service are implemented </li></ul><ul><ul><li>Both are registered </li></ul></ul><ul><ul><li>Testing of the WS can be invoked through the corresponding T-service </li></ul></ul>
  25. 25. Conclusion <ul><li>Challenges to testing web service applications </li></ul><ul><ul><li>Testing a web service as the developer’s own software </li></ul></ul><ul><ul><li>Integration testing at development time and at run-time </li></ul></ul><ul><ul><ul><li>No support in the current WS standards stack </li></ul></ul></ul><ul><li>A service-oriented approach is proposed </li></ul><ul><ul><li>The architecture fits well into service-oriented architecture </li></ul></ul><ul><ul><li>Supported by a software testing ontology </li></ul></ul><ul><li>The feasibility of the approach was tested via a case study. </li></ul>
  26. 26. Advantages <ul><li>An automated process meeting the requirements of on-the-fly service integration testing </li></ul><ul><ul><li>Automation without human involvement </li></ul></ul><ul><ul><li>Testing without interfering with the provision of normal functional services </li></ul></ul><ul><ul><li>Testing without affecting the real-world state </li></ul></ul><ul><li>Security and IPR can be managed through a certification and authentication mechanism for third-party specialised testing services </li></ul><ul><li>Business opportunities for testing tool vendors and software testing companies to provide testing services online as web services </li></ul>
  27. 27. Remaining challenges and future work <ul><li>Technical challenges </li></ul><ul><ul><li>To develop a complete ontology of software testing (e.g. covering the many different representation formats of testing-related artefacts) </li></ul></ul><ul><ul><li>To implement test brokers efficiently </li></ul></ul><ul><ul><li>To devise the mechanism of certification and authentication for testing services </li></ul></ul><ul><li>Social challenges </li></ul><ul><ul><li>For the approach to be practically useful, it must be adopted by web service developers, testing tool vendors and software testing companies </li></ul></ul><ul><ul><li>Standards are needed, such as a standard software testing ontology </li></ul></ul>
  28. 28. References <ul><li>Zhang, Y. and Zhu, H., Ontology for Service Oriented Testing of Web Services , Proc. of the Fourth IEEE International Symposium on Service-Oriented System Engineering (SOSE 2008), Dec. 18-19, 2008, Taiwan. In press. </li></ul><ul><li>Zhu, H., A Framework for Service-Oriented Testing of Web Services , Proc. of COMPSAC’06, Sept. 2006, pp. 679-691. </li></ul><ul><li>Zhu, H. and Huo, Q., Developing a Software Testing Ontology in UML for a Software Growth Environment of Web-Based Applications , Chapter IX in Software Evolution with UML and XML, Hongji Yang (ed.), IDEA Group Inc., 2005, pp. 263-295. </li></ul><ul><li>Zhu, H., Cooperative Agent Approach to Quality Assurance and Testing Web Software , Proc. of QATWBA’04/COMPSAC’04, Sept. 2004, IEEE CS, Hong Kong, pp. 110-113. </li></ul><ul><li>Zhu, H., Huo, Q. and Greenwood, S., A Multi-Agent Software Environment for Testing Web-based Applications , Proc. of COMPSAC’03, 2003, pp. 210-215. </li></ul>