S-CUBE LP: Service Discovery and Task Models

  1. S-Cube Learning Package, Service Discovery and Identification: Service Discovery and Task Models. City University London (CITY). Konstantinos Zachos, Angela Kounkou, Neil Maiden, CITY. www.s-cube-network.eu
  2. Learning Package Categorization: S-Cube > Engineering Principles, Techniques and Methodologies for Hybrid, Service-based Applications > Service Discovery and Identification > Service Discovery and Task Models
  3. Learning Package Overview: Problem description, Task modeling, Task-based service discovery, Discussion, Conclusions
  4. Problem description. Established context factors such as time and location have been applied to the design of service-based applications (SBAs); user tasks, however, are an often overlooked factor in the engineering of SBAs. Task modeling is an established research topic in the field of Human-Computer Interaction (HCI). The resulting models can help run-time service environments to:
     – select, compose and invoke services that explicitly fit the goals and constraints of the user task
     – overcome mismatches between user requests and the descriptions of the software services that meet those requests
     We develop, codify and apply user task models in a service discovery environment to discover the best-fit services for a user task.
  5. Learning Package Overview: Problem description, Task modeling, Task-based service discovery, Discussion, Conclusions
  6. Task modeling concepts. Task modeling is the processing of known or inferred data about a user task into graphical, structured representations of user task knowledge: user task models. The aim is to understand what users want to achieve and the activities performed to bring about the desired state.
     – A user goal is the end result a system user wants to achieve
     – Tasks are the activities that must be performed to achieve a user goal
     – Actions are simple tasks that cannot be decomposed into sub-tasks
  7. Task modeling notations. Several task modeling notations exist, including Hierarchical Task Analysis (HTA); Goals, Operators, Methods, Selection rules (GOMS); and ConcurTaskTrees (CTT). We represent the user task models in our approach using the CTT formalism because it adopts an engineering approach to user task models, and its more complete and precise semantics can support automated service discovery more effectively than other task modeling formalisms.
     CTT resources:
     F. Paterno, C. Mancini, S. Meniconi, ConcurTaskTrees: A Diagrammatic Notation for Specifying Task Models, Proceedings of the IFIP TC13 International Conference on Human-Computer Interaction, pp. 362-369, 1997.
     D. Sinnig, M. Wurdel, P. Forbrig, P. Chalin, F. Khendek, Practical Extensions for Task Models, Proceedings of the Sixth International Workshop on TAsk MOdels and DIAgrams (TAMODIA 07), Lecture Notes in Computer Science Vol. 4849, Springer, Toulouse, France, 2007.
  8. Task-based service discovery. Two-stage approach to task-based service discovery:
     – User task model specification
     – Discovery process
  9. 1 - User task model specification. The user task model defines a reusable task structure that encapsulates well-defined functionality for a recurrent design problem. Key elements of our user task model schema include:
     – User task and goal description
     – CTT task model
     – Service class description
  10. User task and goal description. Each user task is specified with natural language descriptions of the task in context and the associated user goal. Example: task and associated user goal for the user task Calculate a distance:
     Name: Calculate distance
     Task: Compute and output the distance between two specified geographical locations "A" and "B"
     Resources: Coordinates for the geographical locations
  11. CTT model (1). The user task is then modeled using the CTT formalism. Background: CTT notation. This task modeling notation possesses:
     – A hierarchical tree structure that decomposes higher-level tasks into the lower-level subtasks that execute them
     – Types graphically depicting a task's performance allocation: user task, application task, interaction task (performed by a human actor and the system interacting) and abstract task (complex tasks that do not fall into any of the other categories)
     – Temporal operators describing the temporal relationships between tasks at the same hierarchical level
  12. CTT model (2). Background: CTT task types.
     – Application task: executed entirely by the system (e.g. display text)
     – User task: performed entirely by the user (e.g. read text)
     – Interaction task: performed by user interactions with the system (e.g. edit text)
     – Abstract task: requires complex actions and does not completely fall into one of the previous categories
  13. CTT model (3). Background: CTT temporal operators.
     Enabling:                          T1 >> T2
     Enabling with information passing: T1 []>> T2
     Disabling:                         T1 [> T2
     Interruption:                      T1 |> T2
     Choice:                            T1 [] T2
     Concurrency:                       T1 ||| T2
     Optionality:                       [T]
     Iteration:                         T1* or T1{n}
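The hierarchy, task types and temporal operators described above can be captured in a small data structure. The following Python sketch is purely illustrative (the class and field names are ours, unrelated to the CTTE tooling) and models a fragment of the Calculate distance example:

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import List, Optional


class TemporalOperator(Enum):
    """CTT temporal operators relating sibling tasks (symbols as on the slide)."""
    ENABLING = ">>"
    ENABLING_WITH_INFO = "[]>>"
    DISABLING = "[>"
    INTERRUPTION = "|>"
    CHOICE = "[]"
    CONCURRENCY = "|||"


@dataclass
class Task:
    """A node in a CTT-style task hierarchy (simplified sketch)."""
    name: str
    task_type: str = "abstract"  # user | application | interaction | abstract
    optional: bool = False       # corresponds to [T]
    iterative: bool = False      # corresponds to T*
    operator_to_next: Optional[TemporalOperator] = None  # links to the next sibling
    children: List["Task"] = field(default_factory=list)


# Fragment of the Calculate distance example: entering the destination enables,
# with information passing ([]>>), the submission of the data.
enter = Task("Enter destination", "interaction",
             operator_to_next=TemporalOperator.ENABLING_WITH_INFO)
submit = Task("Submit data", "interaction")
root = Task("Calculate distance", children=[enter, submit])
```

A structure like this makes the temporal relationships machine-readable, which is what allows a discovery engine to walk the tree programmatically.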
  14. CTT model (4). Background: developing a CTT.
     – Decompose tasks into subtasks
     – Identify the temporal relationships between tasks at the same level of the hierarchy
     – Identify objects (entities manipulated to perform tasks) and the associated actions
  15. CTT model – Calculate Distance example (1). Note: diagram drawn in the ConcurTaskTree Environment (CTTE).
  16. CTT model – Calculate Distance example (2). Notes – CTT notation as applied in the example:
     – Task hierarchy: the higher-level task Calculate distance is decomposed into the lower-level subtasks that execute it: Input start, Enter destination, Submit data, Validate data, Compute distance, Display distance and View distance. The subtasks themselves can be decomposed further (e.g. Input start and Validate data)
     – Types: graphical syntax elements indicate each tree node's type; for instance, the interaction tasks comprise Enter destination, Submit data, View distance, Accept current location and Enter start
     – Temporal operators: the relationships between tasks at the same hierarchical level and their occurrence in time are described. For example, Enter destination enables and passes information (here, the elicited destination) to Submit data, which in turn enables and informs the subtask Validate data, as described by the operator []>>
  17. Association to service classes. Each user task model associates each application subtask described in a CTT model with one or more classes of software service. We associate service classes with user task models based on systematic analysis by people with domain expertise, using the following steps:
     – systematically explore each pair-wise association between a user task and a service class
     – make a design decision as to whether the application sub-task could be wholly or partially implemented using a software service of that class
     – create an association between the sub-task and the service class if such an implementation was agreed to be feasible
     Returning to our Calculate Distance example with the application sub-task Validate data, we can associate it with services of the class DataValidation because one or more such services could reasonably be invoked by a service-based application.
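The sub-task-to-service-class associations that result from these steps can be held in a simple lookup structure. A minimal Python sketch follows; the Validate data / DataValidation pair is from the slides, while the Compute distance / RoutePlanning entry and the function name are hypothetical:

```python
# Illustrative catalogue fragment linking application sub-tasks of a user task
# model to service classes. Only the first entry is taken from the slides.
TASK_TO_SERVICE_CLASSES = {
    "Validate data": ["DataValidation"],
    "Compute distance": ["RoutePlanning"],
}


def service_classes_for(subtask: str) -> list:
    """Return the service classes associated with an application sub-task,
    or an empty list if no association has been created."""
    return TASK_TO_SERVICE_CLASSES.get(subtask, [])
```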
  18. Service class description. Each service class associated with a user task model is described using neutral terms sourced from online encyclopedias to avoid unintended bias during service discovery. Continuing the DataValidation example, the class is described with a functional description taken from the source concept definition, and a list of operations derived from action terms (verbs) in the concept description.
  19. 2 – End user task-based service discovery
  20. Task-based service discovery process. Background: original service discovery approach. We use the SeCSE service discovery environment as the platform upon which we design and implement the S-Cube approach. The Expansion and Disambiguation Discovery Engine (EDDiE) formulates service queries from use case and requirements specifications expressed in structured natural language. EDDiE can be configured to formulate the queries using:
     – information retrieval techniques (full-EDDiE)
     – keyword matching techniques (EDDiE-lite)
     More information on EDDiE:
     K. Zachos, N.A.M. Maiden, S. Jones, X. Zhu, Discovering Web Services To Specify More Complete System Requirements, Proc. 19th Conference on Advanced Information System Engineering (CAiSE07), pp. 142-157, 2007.
  21. Task-based service discovery process. Background: full-EDDiE. Full-EDDiE algorithm components:
     – Natural Language Processing: the service query is divided into sentences, tokenized, part-of-speech tagged and modified to include each term's morphological root (e.g. driving to drive; drivers to driver)
     – Word Sense Disambiguation: disambiguate each term by defining its correct sense and tagging it with that sense (e.g. defining a driver to be a vehicle driver rather than a type of golf club)
     – Query Expansion: expand each term with other terms of similar meaning to increase the likelihood of a match with a service description (e.g. driver is synonymous with motorist, which is then also included in the query)
     – Service Matching: match all expanded and sense-tagged query terms to a similar set of terms describing each candidate service, expressed using a service description
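The component pipeline above can be illustrated with a toy sketch. This Python fragment is not the EDDiE implementation: the hard-coded synonym table stands in for the WordNet-style expansion, word sense disambiguation is omitted, and all function names are ours:

```python
import re

# Toy synonym table standing in for the thesaurus-based expansion used by
# full-EDDiE; these pairs are illustrative only.
SYNONYMS = {"driver": ["motorist"], "journey": ["trip"]}


def nlp(text: str) -> list:
    """Step 1 (sketch): lowercase and tokenize. The real component also
    sentence-splits, part-of-speech tags and reduces terms to their roots."""
    return re.findall(r"[a-z]+", text.lower())


def expand(terms: list) -> list:
    """Step 3 (sketch): add terms with similar meaning to raise the
    likelihood of a match with a service description."""
    return terms + [s for t in terms for s in SYNONYMS.get(t, [])]


def match_score(query_terms: list, service_description: str) -> float:
    """Step 4 (sketch): score a candidate service by the fraction of query
    terms that also occur in the service description."""
    query, service = set(query_terms), set(nlp(service_description))
    return len(query & service) / len(query) if query else 0.0


query = expand(nlp("The driver plans a journey"))
# The expanded query now also contains "motorist" and "trip", so a service
# described with either word can still be matched.
```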
  22. Task-based service discovery process. Background: EDDiE-lite. The EDDiE-lite algorithm implements only 2 of the 4 EDDiE components: Natural Language Processing and Service Matching. Full-EDDiE undertakes service discovery with term expansion; EDDiE-lite undertakes it with no term expansion.
     Discovery activities          EDDiE-Lite   Full-EDDiE
     Natural Language Processing   ✔            ✔
     Word Sense Disambiguation                  ✔
     Query Expansion                            ✔
     Service Matcher               ✔            ✔
  23. Task-based service discovery. The task-based algorithm TEDDiE extends the original EDDiE algorithm with two additional steps: it discovers services by matching terms describing the user problem to user task models that are, in turn, used to reformulate the service queries. Both versions of EDDiE are extended with a catalogue of class-level user task models that link application sub-tasks to classes of service solution. As with EDDiE, there are two configurations of TEDDiE:
     – Full-TEDDiE uses information retrieval techniques to add domain knowledge to service queries
     – TEDDiE-lite uses simple keyword matching
  24. Task-based service discovery. Service discovery algorithm enhanced with user task knowledge.
  25. Task-based service discovery. TEDDiE algorithm components:
     – Natural Language Processing
     – Word Sense Disambiguation (full-TEDDiE only)
     – Query Expansion (full-TEDDiE only)
     – Query Matching to User Task Models: matches a service query to the natural language description of each user task model in the catalogue, resulting in an ordered set of retrieved user task models
     – Query Reformulation: extracts terms from the descriptions of the service classes associated with each application sub-task of each retrieved user task model to generate new service queries. The reformulation implements two activities: (i) extend the original service query with the new terms extracted from the user task model; (ii) replace the service query with the new terms extracted from the user task model
     – Service Matching
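The two query reformulation activities can be sketched in a few lines of Python; the function name, signature and example term lists are ours, not taken from the TEDDiE code:

```python
def reformulate(query_terms: list, class_terms: list, mode: str = "extend") -> list:
    """Sketch of TEDDiE's two reformulation activities: extend the original
    service query with service-class terms, or replace it with them."""
    if mode == "extend":
        # keep the original terms, append class terms not already present
        return query_terms + [t for t in class_terms if t not in query_terms]
    if mode == "replace":
        # discard the original query entirely
        return list(class_terms)
    raise ValueError("mode must be 'extend' or 'replace'")


q = ["journey", "plan"]
s = ["route", "plan", "map"]
extended = reformulate(q, s, "extend")    # ["journey", "plan", "route", "map"]
replaced = reformulate(q, s, "replace")   # ["route", "plan", "map"]
```

As the evaluation later reports, extending rather than replacing tends to be the more effective strategy.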
  26. Task-based service discovery. Recap: activities associated with each discovery strategy.
     Step  Discovery activity                                              TEDDiE-Lite (no term expansion)   Full-TEDDiE (term expansion)
                                                                           Replace        Extend             Replace        Extend
     1     Natural language processing of original service query           ✔              ✔                  ✔              ✔
     1     Word sense disambiguation of original service query                                               ✔              ✔
     1     Query expansion of original service query                                                         ✔              ✔
     1     Match query to user task models                                 ✔              ✔                  ✔              ✔
     2     Natural language processing of each service class description   ✔              ✔                  ✔              ✔
     2     Word sense disambiguation of terms of each service class description                              ✔              ✔
     2     Query expansion of terms of each service class description                                        ✔              ✔
     2     Reformulating service queries by replacing terms                ✔                                 ✔
     2     Reformulating service queries by extending terms                               ✔                                 ✔
     2     Service matching with reformulated service queries              ✔              ✔                  ✔              ✔
  27. Task-based service discovery example (1). Consider the following initial service query:
     The user sends a journey planning request with details about the start, end point and travel preferences for his journey.
     Full-TEDDiE extracts query terms from the original service query, i.e. Q' = [journey, travel, preference, user, start, end point, plan, send]. The algorithm then generates new query terms through term expansion, e.g. Q'' = [termination, commence, direction, move, place, go].
  28. Task-based service discovery example (2). The algorithm then matches the query, consisting of both Q' and Q'', to the user task model catalogue. Assume that one of the retrieved models is Calculate a distance, with an associated service class Route planning software:
     Route planning software is a computer software programme designed to plan an (optimal) route between two geographical locations using a journey planning engine, typically specialised for road networks as a road route planner. It can typically provide a list of places one will pass by, with crossroads and directions that must be followed, road numbers, distances, etc. It also usually provides an interactive map with a suggested route marked on it.
  29. Task-based service discovery example (3). Full-TEDDiE implements the extend query reformulation, which generates the following original and expanded service class terms:
     S' = [direction, crossroads, location, distance, road, map, suggest]
     S'' = [way, itinerary, calculation, travel by, path, motor, travel]
     Finally, with this service class information, TEDDiE generates and fires a reformulated service query Q based on the extend query reformulation, such that Q = {Q', Q'', S', S''}:
     Q = {[journey, travel, preference, user, start, end point, plan, send], [termination, commence, direction, move, place, go], [direction, crossroads, location, distance, road, map, suggest], [way, itinerary, calculation, travel by, path, motor, travel]}
  30. Learning Package Overview: Problem description, Task modeling, Task-based service discovery, Discussion, Conclusions
  31. Evaluation. Empirical evaluations were performed to assess TEDDiE's most effective strategies and the effect of adding user task knowledge to service queries:
     – a set of user task models was developed and extended with knowledge about classes of software service in the e-government domain
     – a human expert classified the services in a target service registry that a series of pre-defined service queries should retrieve as relevant
     – the service queries were fired at the target service registry
     – results were collected and analysed to investigate the algorithms' performance; statistical measures included the arithmetic means of the totals of relevant and irrelevant retrieved services, and the precision, recall and balanced F-score of the generated service queries
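The precision, recall and balanced F-score measures mentioned above are the standard information retrieval definitions; a generic Python sketch (the service ids are illustrative, not from the evaluation data):

```python
def precision_recall_f1(retrieved: set, relevant: set):
    """Standard retrieval measures: precision (fraction of retrieved services
    that are relevant), recall (fraction of relevant services retrieved) and
    the balanced F-score (their harmonic mean)."""
    hits = len(retrieved & relevant)
    precision = hits / len(retrieved) if retrieved else 0.0
    recall = hits / len(relevant) if relevant else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1


# A query retrieves four services, three of which the expert marked relevant,
# out of six relevant services in the registry overall:
p, r, f = precision_recall_f1({"s1", "s2", "s3", "s4"},
                              {"s1", "s2", "s3", "s5", "s6", "s7"})
# p = 0.75, r = 0.5, f = 0.6
```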
  32. Evaluation results. Current evaluation results suggest that effective strategies expand service queries with either user task or domain knowledge:
     – extending rather than replacing service queries with additional knowledge about user tasks improves the overall effectiveness of task-based service discovery
     – query reformulation with user task knowledge increases the number of relevant services retrieved only when comparing the lite versions of both algorithms (EDDiE and TEDDiE), not the full ones
     – query reformulation with user task knowledge improves the overall correctness of retrieved services only when comparing the lite versions of both algorithms, not the full ones
     – query reformulation with user task knowledge does not decrease the number of irrelevant services retrieved by a service discovery engine
  33. Service classes. Service class descriptions are pivotal to effective service query reformulation and retrieval. Using unaltered text from publicly available encyclopedias is an arguably weak approach to describing the service classes associated with application sub-tasks. TEDDiE's precision, recall and balanced F-score measures may be increased through improved descriptions of the service classes in user task models, possibly through:
     – more thorough specification of service class description terms
     – use of more formal languages and ontologies with which to describe service classes
  34. User task models. No independent validation of the user task models was undertaken for the empirical evaluation. The user task model specification may be extended by exploiting other user task model semantics, such as resource descriptions and temporal associations between sub-tasks, in order to enrich service queries. User task model development effort:
     – generating and codifying user task knowledge in models has an associated cost not incurred with available on-line thesauri
     – service queries are not restricted to tasks that instantiate the class-level user task models in a catalogue
     – user task models need to offer more explicit support in order to offset their development cost
  35. Learning Package Overview: Problem description, Task modeling, Task-based service discovery, Discussion, Conclusions
  36. Summary. User task models can be used to improve the discovery of services that explicitly fit the goals and constraints of a user task. Our approach to task-based service discovery populates an online catalogue with extended user task models to support a discovery process comprising:
     – Natural Language Processing
     – Word Sense Disambiguation
     – Query Expansion
     – Query Matching to User Task Models
     – Query Reformulation
     – Service Matching
     Current results indicate research directions for refining the TEDDiE algorithm for improved performance over non-task-based discovery.
  37. Acknowledgements. The research leading to these results has received funding from the European Community's Seventh Framework Programme [FP7/2007-2013] under grant agreement 215483 (S-Cube).