Validation of Service Oriented Computing DEVS Simulation Models

Presentation at the 2nd International Workshop on Model-driven Approaches for Simulation Engineering

(held within the SCS/IEEE Symposium on Theory of Modeling and Simulation part of SpringSim 2012)

Please see: http://www.sel.uniroma2.it/mod4sim12/ for further details


  1. Validation of Service Oriented Computing DEVS Simulation Models
     Hessam Sarjoughian and Mohammed Muqsith
       Arizona Center for Integrative Modeling & Simulation
       School of Computing, Informatics, and Decision Systems Engineering
     Dazhi Huang and Stephen Yau
       Information Assurance Center
       School of Computing, Informatics, and Decision Systems Engineering
  2. Motivation
     A key promise of Service Based Software Systems is on-demand Quality of Service (QoS).
     However, system design with QoS support is challenging; QoS depends on:
       • System architecture
       • Interactions among constituent parts under a dynamic environment
     [Diagram: voice communication and encryption services running on server software/hardware, connected through links and a hub to client communication applications on client software/hardware]
  3. Motivation
     Design evaluations addressing QoS in Service Based Software Systems:
       • Difficult to track and inflexible for experimentation
       • Small-scale system QoS can be predicted using analytical methods
       • Complex interactions in large-scale systems complicate QoS prediction
     Simulation, in contrast to real design and implementation:
       • Offers alternate ways of understanding, development, and experimentation
       • Easier to configure the system, with repeatable experimentation capability
       • Early evaluation of system architecture
       • Simplifies complex system design and evaluation
       • Validation is generally a necessity
  4. Background
     Service Oriented Computing (SOC):
       • A paradigm of computation based on the concept of service
       • Services are software entities that are loosely coupled, publishable, discoverable, composable, and platform-independent
     [Diagram: broker, publisher, and subscriber services exchanging (1–2) publish/subscribe, (3) discover, (4) service call with data/service messages, and (5) service response]
     Service Oriented Architecture (SOA):
       • Concepts and principles toward building SOC systems
       • Software systems based on SOA are known as Service Based Software Systems (SBS)
  5. SOC-DEVS Co-Design Modeling Methodology
     SOC-DEVS:
       • Introduces the SW/HW co-design modeling concept in SOA-Compliant DEVS (SOAD)
       • Provides flexible synthesis via assignment of software services to networked hardware
       • Models service interaction through networked hardware
     [Diagram: an SBS design is partitioned into SOA-compliant services and hardware, combined through a flexible mapping into the SW/HW co-designed Service Based Software System]
  6. SOC-DEVS: Component Abstractions
     Software Layer:
       • swService
         – Generic software layer; operations constrained by hardware resources
         – Interacts with the lower layer
       • Broker, Publisher, Subscriber
         – SOA-compliant service models that extend the basic swService
     System Service Mapping (SSM):
       • Provides flexible assignment of services to processors
     Hardware Layer:
       • Processor
         – CPU: CPU cycles required for service execution
         – Memory: amount consumed during service execution
       • Transport Unit
         – Directs messages to/from swServices
       • Network Interface Controller (NIC)
         – Transmits/receives and queues/dequeues packets and frames
       • Link
         – Models physical media; interconnects multiple network switches
       • Network Switch, Router
         – Interconnects networks; routes packets
  7. SOC-DEVS: Service Interactions
     swService accounts for two basic aspects:
       • Operations — denotes functionality provided by the service
       • Communications — denotes service-to-service interaction capability
     Models service-to-service interaction through the hardware layer:
       • Jobs — Job (cycles/sec, MBytes of memory) represents the computational load of operations
       • Messages — Message (MsgType, Size) represents the communication load of communications
     [Diagram: a swService's operations are mapped to Jobs sent to the CPU, and its communications to Messages sent through the Transport Unit and NIC of Processor 1, via the SSM]
  8. SOC-DEVS: Networked Interactions
     swService accounts for two basic aspects:
       • Operations — denotes functionality provided by the service
       • Communications — denotes service-to-service interaction capability
     Models service-to-service interaction through the hardware layer:
       • Jobs — Job (cycles/sec, MBytes of memory) represents the computational load of operations
       • Messages — Message (MsgType, Size) represents the communication load of communications
     [Diagram: swService 1 on Processor M and swService k on Processor N exchange Jobs (handled by each CPU) and Messages (routed through Transport Unit, NIC, and Link via a Router/Switch), coordinated by the SSM]
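The Job and Message load abstractions above can be illustrated as simple data types. This is a hypothetical Python sketch: the field names are assumptions, since the actual SOC-DEVS models are DEVS components; it only mirrors the Job (cycles/sec, MBytes of memory) and Message (MsgType, Size) tuples named in the slide.

```python
from dataclasses import dataclass

@dataclass
class Job:
    """Computational load an operation places on a Processor's CPU."""
    cycles_per_sec: float  # CPU cycles/sec required for service execution
    memory_mb: float       # MBytes of memory consumed during execution

@dataclass
class Message:
    """Communication load a service-to-service interaction places on the network."""
    msg_type: str          # e.g. "audio-frame" (hypothetical value)
    size_bytes: int        # payload size routed through Transport Unit, NIC, and Link

# A swService operation becomes a Job for the CPU; its communication with a
# remote swService becomes a Message routed through the hardware layer.
job = Job(cycles_per_sec=2.2e9, memory_mb=64.0)
msg = Message(msg_type="audio-frame", size_bytes=1400)
```

The concrete values here (2.2e9 cycles, 64 MB, 1400-byte payload) are illustrative, not taken from the presented models.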
  9. SOC-DEVS: Simulation Example
     Real Voice Communication System (VCS):
       • Streams end-to-end VoIP audio data to subscribers
       • Supports audio sampling rates of 44.1–220.5 KHz
       • Supports 256-key DES encoding with 0% or 100% encryption
       • Supports multiple subscribers simultaneously
     VCS modeling in SOC-DEVS:
       • The real VCS is modeled
       • Models end-to-end VoIP audio data with the same sampling rates (44.1–220.5 KHz) and data encryption (256-key DES encoding; 0% or 100% encryption)
       • The simulation testbed is configured with the same configuration as the real VCS
     System QoS is measured by:
       • VCS throughput
       • Inter-data-frame delay

     Table 1: System configuration
       Category                               Real System                  Simulation System
       Processor (CPU, Memory, Network Card)  2.2 GHz, 1024 MB, 100 Mbps   2.2 GHz, 1024 MB, 100 Mbps
       Network Link Bandwidth                 100 Mbps                     100 Mbps
       Subscriber #                           1–40                         1–40, 100–1000
       Data Collection Duration               60 sec (wall clock)          60 sec (logical clock)

     Real system web services are developed in C# .NET.
  10. Testbed
     Supports experimentation, data collection, and data analysis. The testbed consists of:
       • Real system
         – Voice Communication System, supporting up to 40 simultaneous clients
         – Automated data collection mechanism: throughput, delay, and packet-level tracing (NetMon 3.4)
       • Simulation system
         – Voice Communication System with arbitrary VCS configurations and larger-scale systems
         – DEVS-Suite simulator with transducer-based data collection
       • Data analysis system
         – MATLAB scripts
     [Diagram: the real and simulation systems feed data into the data analysis system, which produces the analysis output]
  11. Round Trip Delay Definition
     RT (round trip) delay: from the client's request-sending event to the first data-arrival event.
       • Consists of server processing delay and network delay: Delay_RT = Delay_server processing + 2 × Delay_network
       • Measured at the client end: Delay_RT = ET2 − ET1, where ET denotes event time
     [Diagram: a client request travels through the network to the VoiceComm service and back]
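Measured at the client, the round-trip delay is just the difference of two event timestamps, and the one-way network delay can be recovered from it when the server processing delay is known. A minimal sketch (function and variable names are illustrative, not from the testbed code):

```python
def round_trip_delay(et_request: float, et_first_data: float) -> float:
    """RT delay = ET2 - ET1: client request-sending event to first data-arrival event."""
    return et_first_data - et_request

def network_delay(rt_delay: float, server_processing: float) -> float:
    """Since RT = server processing delay + 2 x network delay,
    recover the one-way network delay from a measured RT delay."""
    return (rt_delay - server_processing) / 2.0

rt = round_trip_delay(10.000, 10.050)  # roughly a 50 ms round trip
one_way = network_delay(rt, 0.020)     # roughly 15 ms one-way network delay
```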
  12. Inter Frame Time
     Inter Frame Time (IFT): the time interval between two consecutive audio frame events at the VoiceComm service.
       • Measured at the server end
       • IFT_K = FT_(K+1) − FT_K, for K = 1, …, N, where FT_K is the time of frame event K
     [Diagram: frames 1–4 leave the VoiceComm service at times FT1–FT4, with intervals IFT1–IFT3 between consecutive frames]
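Given the sequence of frame-event times FT_1 … FT_N recorded at the server, the inter-frame times follow directly from the definition above (an illustrative sketch; the frame times shown are made up):

```python
def inter_frame_times(frame_times):
    """IFT_K = FT_{K+1} - FT_K for consecutive audio frame events."""
    return [t2 - t1 for t1, t2 in zip(frame_times, frame_times[1:])]

# Four frame events yield three inter-frame times.
ifts = inter_frame_times([0.00, 0.02, 0.05, 0.07])
```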
  13. Accuracy
     Accuracy: the ratio of Total Bytes Received to Total Bytes Sent, A = TBR / TBS.
       • Total Bytes Received (TBR): aggregated data bytes received by all clients, TBR = Σ BR(K), K = 1, …, N, where K denotes the client ID
       • Total Bytes Sent (TBS): aggregated data bytes sent by the VoiceComm service for all clients, TBS = Σ BS(K), K = 1, …, N
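The accuracy metric aggregates per-client byte counts; a minimal sketch under the assumption that byte counts are kept per client ID (the dict representation and sample numbers are illustrative):

```python
def accuracy(bytes_sent: dict, bytes_received: dict) -> float:
    """A = TBR / TBS, summing BS(K) and BR(K) over all client IDs K."""
    tbs = sum(bytes_sent.values())      # Total Bytes Sent by the VoiceComm service
    tbr = sum(bytes_received.values())  # Total Bytes Received by all clients
    return tbr / tbs

# Two clients; client 2 lost some data, so accuracy is below 1.0.
sent = {1: 1_000_000, 2: 1_000_000}
received = {1: 1_000_000, 2: 950_000}
a = accuracy(sent, received)  # 0.975
```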
  14. Experiment Scenario
     Clients request audio data from the VoiceComm service via the network.
       • VCS sampling rate: 44.1–220.5 KHz
       • VCS buffer size: 16K
       • Number of clients: 5–20
       • Three machines (M1, M2, M5) connected via a network; M1 runs the VoiceComm service, while M2 and M5 act as clients using multiple threads
       • The VoiceComm service sends data for 60 seconds to each client
       • Data is collected at probe points
       • Each configuration has 10 runs, and data is averaged over these 10 runs
  15. Real System Data Probe Points
     [Diagram: probe point 1 at the VoiceComm service in the software layer; probe points 2, 3, and 5 captured via NetMon at the UDP/IP and NDIS driver layers, above the NIC driver and sound card driver (OS) and the network card (NIC) and sound card (HW)]
  16. Real System: Automated Data Collection Process
     [Flowchart: start the experiment and invoke/request the service; start UDP/IP data event logging at probe points 2 and 3 (network packet layer) and audio data output event logging at probe point 1 (software code, Windows Performance Objects); loop until the service stops/completes; then stop audio data output event logging and stop NetMon]
  17. Results
     Time accuracy: milliseconds
  18. SOC-DEVS Simulator
     SOC-DEVS = SOA-DEVS (SOAD) + SW/HW co-design, where SOAD = SOA + DEVS.
     [Diagram: SOA-compliant service models mapped onto hardware models]
  19. Experimentation Platform / Future Work
     SBS Experimentation Platform
  20. Conclusion
     Developed an approach for validating SOC-DEVS (SW/HW co-design) simulation models:
       • Automated real-time data collection
       • Voice Communication System case study
     Service QoS depends on the integrated software and hardware layers.
     Validation of Service Based Software System simulations is a grand challenge, especially as the SW and HW interactions grow in complexity and scale.
     http://devs-suitesim.sourceforge.net
  21. Questions? Thank you
  22. SOA-DEVS and SOC-DEVS: Contrasts
     SOA-DEVS:
       1. Service timing aspect is directly specified in the service models:
          service execution time dt = mean delay ± sigma
       2. Limited aspect of hardware representation (only routing logic)
          – Formal specification of sw/sw interaction semantics
          – No SW/HW separation and synthesis capability
          – Service models and their interactions are specified with the DEVS formalism
     SOC-DEVS:
       1. Service timing aspect is indirectly determined by the interactions with the hardware models:
          service execution time dt = job completion time in CPU = T_Jobs(out) − T_Jobs(in)
       2. Detailed representation of service as well as hardware models
          – Formal specification of sw/hw and sw/sw interaction semantics
          – Supports SW/HW separation and synthesis capability
          – Both service and hardware models, as well as their interactions, are specified with the DEVS formalism
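The two timing schemes contrasted on this slide can be sketched side by side. This is a hedged illustration, not the actual DEVS model logic: the Gaussian draw for "mean delay ± sigma" and the timestamp bookkeeping for job completion are stand-ins chosen for this sketch.

```python
import random

def soa_devs_execution_time(mean_delay: float, sigma: float) -> float:
    """SOA-DEVS: timing is specified directly in the service model,
    dt = mean delay +/- sigma (sampled here from a Gaussian as an assumption)."""
    return random.gauss(mean_delay, sigma)

def soc_devs_execution_time(t_jobs_in: float, t_jobs_out: float) -> float:
    """SOC-DEVS: timing emerges from the hardware model,
    dt = job completion time in CPU = T_Jobs(out) - T_Jobs(in)."""
    return t_jobs_out - t_jobs_in

random.seed(42)
dt_soa = soa_devs_execution_time(0.050, 0.005)  # a sampled delay near 50 ms
dt_soc = soc_devs_execution_time(1.000, 1.048)  # roughly 48 ms, set by the CPU model
```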
