A SOAP Performance Comparison of Different WSRF Implementations

Slide notes

  • Insert a space between "6" and "WSRFLite". Capitalize "benchmark" as the first word of the second bullet point; rather something like "benchmark results can indicate which implementation is favorable if performance is a key requirement" (essentially stressing that the benchmark helps when performance is the concern, but is not necessarily meaningful otherwise).
  • This is a good overview. Regarding 4, you can say that such work, building on our investigations, would also be interesting in the future. Otherwise, have a look here, there are a few summaries: https://wickie.hlrs.de/staff/index.php/User:Hpckueb/Papers
  • Bullet 1: "based" instead of "base". Bullet 4: What do you want to say with this? "Even though it's not standardized, it can make results comparable"?
  • "Each service exposes 3 different types of operations:" in bullet 2
  • To remember for MIO: similar to data that is, for example, used in complex fluid dynamics simulations. echoVoid tests the latency of the SOAP stack (without the processing overhead of send/receive).
  • You can say something like "GT4 under Linux is clearly an outlier, but repeated measurements yielded the same value. As of now we cannot explain this drop in performance."
  • I believe the paper said here that GT4 can no longer play to its advantage (see send*, right?) because its deserialization is so poor. You can take this from the paper and possibly mention it only verbally.

Transcript

  • 1. A SOAP Performance Comparison of different WSRF Implementations Roland Kübert, Axel Tenschert, Hai-Lang Thai {kuebert, tenschert}@hlrs.de High Performance Computing Center Stuttgart (HLRS), University of Stuttgart SOAP Comparison to WSRF 28.11.2009
  • 2. Introduction
    • SOAP is the protocol used most often in web services communications
    • WSRF uses SOAP as a communications protocol
    SOAP Comparison to WSRF 28.11.2009
    • Today: performance analyses of SOAP toolkits have been performed for various cases and toolkits. But: WSRF implementations have generally not been taken into account
  • 3. Introduction
    • The SOAP performance of three WSRF implementations is compared:
      • UNICORE 6 WSRFLite 1.8.6
      • Globus Toolkit 4 Java WS-Core 4.2.1
      • Apache Muse v2.2.0
    • Benchmark results can indicate which implementation is favorable if performance is a key requirement
    SOAP Comparison to WSRF 28.11.2009
  • 4. Related Work
    • Investigation of applicability of SOAP in Real-Time Trading Systems [5]
    • Analysis of the feasibility of SOAP for Scientific Computing [1]
    • Test of specific SOAP toolkits (Axis 1, gSOAP, bSOAP and XSUL) in a generic SOAP benchmark suite [2]
    • Investigation of WSRF specific operations for Globus Toolkit v3.9.2 but without deeper conclusions [7]
    • Analysis of the suitability of SOAP for wireless devices [4]
    SOAP Comparison to WSRF 28.11.2009
  • 5. Related Work
    • This work is based on:
      • the benchmark suite developed by Head et al. [2]
    • This suite was selected because it was developed with the aim of providing a standard benchmark suite for quantifying, comparing and contrasting the performance of SOAP implementations
    • Covers a wide range of use cases
    SOAP Comparison to WSRF 28.11.2009
  • 6. Methodology: Software
    • For each middleware one service is developed
    • Each service exposes three different types of operations (see the sketch below):
      • Echo: received values are sent back
      • Receive: the number of received values is sent back
      • Send: for a received number, that many values are sent back
    • All operations are implemented for the primitive data types:
      • byte, double, int and string
    SOAP Comparison to WSRF 28.11.2009
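    A minimal Java sketch of the service contract described above. All names here are assumptions for illustration, not code from the paper, and the int variants stand in for the other payload types:

      // Hypothetical service contract (names assumed); analogous
      // variants exist for byte, double and String payloads.
      public interface BenchmarkService {
          // echo*: the received values are sent back unchanged
          int[] echoInt(int[] values);
          // receive*: only the number of received values is sent back
          int receiveInt(int[] values);
          // send*: for a received number n, n values are sent back
          int[] sendInt(int n);
          // echoVoid: no input, no output (latency test, slide 9)
          void echoVoid();
      }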
  • 7. Methodology: Software
    • Additionally, two complex data types are used (sketched below):
      • MeshInterfaceObject: consists of two integers that represent coordinates and a double that represents a field value at the given position; similar to data used, for example, in complex fluid dynamics simulations
      • SimpleEvent: an object representing an event that is composed of a sequence number (int), a time stamp (double) and a message (String)
    • Operation echoVoid (void input and output) is implemented to test the latency of the SOAP stack without the processing overhead of send/receive
    SOAP Comparison to WSRF 28.11.2009
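    A plain-Java sketch of the two complex types as described; the field names are assumptions, and in practice each toolkit generates these types from an XML schema:

      // Two integers for the coordinates, a double for the field
      // value at that position.
      class MeshInterfaceObject {
          int x;
          int y;
          double value;
      }

      // An event: sequence number, time stamp and message.
      class SimpleEvent {
          int sequenceNumber;
          double timestamp;
          String message;
      }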
  • 8. Methodology: Hardware
    • Server: Dell Latitude D620 with Intel® Core 2 Duo™ CPU T7400 2.17 GHz and 2 GB of memory
    • Services benchmarked in:
      • Ubuntu Linux v9.04 (Kernel 2.6.28-11-generic)
      • Windows Vista Enterprise 32-bit Service Pack 1
    • Client: Dell Optiplex 320 with Intel® Pentium® D CPU 3.00 GHz and 2 GB of memory
      • Windows Vista Enterprise 32-bit Service Pack 1
    SOAP Comparison to WSRF 28.11.2009
  • 9. Results: Latency
    • Performed by calling the void operation (a timing sketch follows the table below):
      • the operation needs no processing except that inherent in every SOAP message exchange
      • good indicator of the overhead imposed by the different toolkits
    SOAP Comparison to WSRF 28.11.2009

    Latency     Windows   Linux
    GT4         6 ms      42 ms
    Muse        4 ms      6 ms
    WSRFLite    2 ms      3 ms
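    A minimal client-side timing sketch, assuming the hypothetical BenchmarkService stub from slide 6; the run count is arbitrary and stub creation is toolkit-specific, so it is omitted:

      public final class LatencyBenchmark {
          // Average round-trip time of the empty operation: since
          // echoVoid carries no payload, this measures the overhead of
          // the SOAP stack itself.
          static double averageLatencyMs(BenchmarkService service, int runs) {
              long start = System.nanoTime();
              for (int i = 0; i < runs; i++) {
                  service.echoVoid();
              }
              return (System.nanoTime() - start) / (double) runs / 1e6;
          }
      }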
  • 10. Results: Latency
    • Ranking by imposed overhead (best first):
      • WSRFLite
      • Muse
      • GT4
    SOAP Comparison to WSRF 28.11.2009
    • General trend: all toolkits run faster under Windows
    • A recent test of different JVMs on Ubuntu Linux and Windows Vista showed opposite results [6]
    • The trend that performance is slower under Linux stays the same for the other tests as well
    • GT4 under Linux is clearly an outlier; repeated measurements yielded the same value, and as of now we cannot explain this drop in performance
  • 11. Results: Serialization
    • Serialization performance was tested with the send* operations (see the client-side sketch below):
      • an integer specifying the size of the array to be created by the service is sent over the wire; it is the only input parameter
      • an array of the corresponding size is then returned
    SOAP Comparison to WSRF 28.11.2009
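    A sketch of how the send* test could be driven from the client. The array sizes are illustrative only, and a naive client-side timing like this also includes the client's own deserialization of the response:

      public final class SerializationBenchmark {
          static void run(BenchmarkService service) {
              // Only the integer n is sent; the service must then
              // serialize an n-element array into its response.
              for (int n : new int[] {1_000, 10_000, 100_000}) {
                  long start = System.nanoTime();
                  int[] result = service.sendInt(n);
                  long elapsedMs = (System.nanoTime() - start) / 1_000_000;
                  System.out.printf("sendInt(%d): %d ms, %d values%n",
                          n, elapsedMs, result.length);
              }
          }
      }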
  • 12. Results: Serialization
    SOAP Comparison to WSRF 28.11.2009
  • 13. Results: Serialization
    SOAP Comparison to WSRF 28.11.2009
  • 14. Results: Serialization
    • Muse performs worst on both platforms
    • GT4 has the best results when dealing with complex objects
    • Otherwise, WSRFLite shows the best performance
    SOAP Comparison to WSRF 28.11.2009
    • Ranking:
      • WSRFLite (without complex objects)
      • GT4
      • Muse
  • 15. Results: Deserialization
    • Deserialization performance was tested with the receive* operations (see the server-side sketch below):
      • an array of objects was sent to the service
      • the size of the array was returned as an integer
    SOAP Comparison to WSRF 28.11.2009
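    On the service side, the receive* handler itself is trivial, as this sketch shows; the measured time is therefore dominated by the toolkit deserializing the incoming array before the handler even runs (names assumed, as before):

      class ReceiveHandler {
          // By the time this method is called, the SOAP stack has
          // already deserialized the array, so the handler itself adds
          // almost no processing time.
          int receiveInt(int[] values) {
              return values.length; // only the element count is sent back
          }
      }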
  • 16. Results: Deserialization
    SOAP Comparison to WSRF 28.11.2009
  • 17. Results: Deserialization
    SOAP Comparison to WSRF 28.11.2009
  • 18. Results: Deserialization
    • Performance ranking under Windows:
      • WSRFLite
      • GT4
      • Muse
    • Performance difference at receiveBase64: WSRFLite clearly ahead of Muse and GT4
    • GT4 cannot play to its advantage from the send* tests here because its deserialization performs poorly
    • No differences under Linux
    SOAP Comparison to WSRF 28.11.2009
  • 19. Results: End-to-End
    • End-to-End performance was tested with echo* operations
      • each service returns the given input array (see the sketch below)
      • the input array incorporates both deserialization (when receiving) and serialization (when sending) operations
    SOAP Comparison to WSRF 28.11.2009
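    The corresponding echo* handler, sketched for completeness: the end-to-end time is essentially deserialization of the input plus serialization of the output, since the handler only returns what it received (names assumed):

      class EchoHandler {
          // End-to-end cost = deserialize input + serialize output;
          // the handler itself does no work beyond returning the array.
          int[] echoInt(int[] values) {
              return values;
          }
      }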
  • 20. Results: End-to-End
    SOAP Comparison to WSRF 28.11.2009
  • 21. Results: End-to-End
    SOAP Comparison to WSRF 28.11.2009
  • 22. Results: End-to-End
    • Ranking under Windows and Linux:
      • WSRFLite
      • GT4
      • Muse
    • echoBase64 operation:
      • WSRFLite is still performing much better
      • GT4 and Muse show nearly the same results
    SOAP Comparison to WSRF 28.11.2009
  • 23. Conclusions
    • Future work:
      • investigate the performance of the three toolkits when making use of the capabilities of WSRF and other implemented specifications
      • investigate operations such as creating and destroying resources, getting and setting properties, or sending and receiving notifications
    28.11.2009 SOAP Comparison to WSRF
  • 24.
    • Thank You!
    SOAP Comparison to WSRF 28.11.2009
  • 25. References
    • [1] K. Chiu, M. Govindaraju, and R. Bramley. Investigating the limits of SOAP performance for scientific computing. In HPDC '02: Proceedings of the 11th IEEE International Symposium on High Performance Distributed Computing, page 246, Washington, DC, USA, 2002. IEEE Computer Society.
    • [2] M. R. Head, M. Govindaraju, A. Slominski, P. Liu, N. Abu-Ghazaleh, R. van Engelen, K. Chiu, and M. J. Lewis. A benchmark suite for SOAP-based communication in grid web services. In SC '05: Proceedings of the 2005 ACM/IEEE Conference on Supercomputing, page 19, Washington, DC, USA, 2005. IEEE Computer Society.
    • [3] F. Ilinca, J.-F. Hetu, M. Audet, and R. Bramley. Simulation of 3-D mold-filling and solidification processes on distributed memory parallel architectures.
    SOAP Comparison to WSRF 28.11.2009
  • 26. References
    • [4] J. Kangasharju, S. Tarkoma, and K. Raatikainen. Comparing SOAP performance for various encodings, protocols, and connections. In Personal Wireless Communications, volume 2775 of Lecture Notes in Computer Science, pages 397–406. Springer-Verlag, 2003.
    • [5] C. Kohlhoff and R. Steele. Evaluating SOAP for high performance business applications: Real-time trading systems, 2003.
    • [6] M. Larabel. Java performance: Ubuntu Linux vs. Windows Vista. http://www.phoronix.com/scan.php?page=article&item=java_vm_performance&num=1
    • [7] M. Li, M. Qi, M. Rozati, and B. Yu. A WSRF based shopping cart system. In P. M. A. Sloot, A. G. Hoekstra, T. Priol, A. Reinefeld, and M. Bubak, editors, EGC, volume 3470 of Lecture Notes in Computer Science, pages 993–1001. Springer, 2005.
    28.11.2009 SOAP Comparison to WSRF