Testing of SDI Components – A Fundamental Interoperability Element within INSPIRE and National SDIs

Presented at:
GI2011 X-border SDI/GDI Symposium
23.5.2011: Bad Schandau (SAX) [11. Sächsisches GIS-Forum]
24.5.2011: Decin (CZE) [1. Bohemian-Saxonian GIS-Forum]


  1. TESTING OF SDI COMPONENTS – A FUNDAMENTAL INTEROPERABILITY ELEMENT WITHIN INSPIRE AND NATIONAL SDIs
     T. Kliment, M. Tuchyňa, M. Kliment
     This presentation was transmitted remotely as a tele-lecture from Pallanza (IT) thanks to the "BizBiz-Tool".
  2. Presentation outline
     • Introduction
     • Basic description of the organizations involved in testing
       o Slovak Environmental Agency (SEA)
       o Slovak University of Technology (SUT)
       o Slovak University of Agriculture (SUA)
       o Proposed tasks related to SDI components testing
     • SDI components provided by SEA to INSPIRE & NSDI
     • Testing methodology
     • Testing environment proposal
     • Testing tools
     • Pilot testing of the SEA SDI components
     • Testing results summary
     • Conclusions and future work
  3. Introduction
     • European and national SDIs should provide data discovery, evaluation and consumption via a central point – a geoportal
     • Network services are the engines that serve data and metadata from repositories to the central SDI geoportal
     • Network service behaviour is defined by standards & specifications (ISO, OGC) and specified by regulations (INSPIRE)
     • Before a network service is connected to the geoportal, it has to be tested against the above requirements to ensure interoperability
     • A voluntary collaboration was established under the auspices of the SDI PTB for such purposes
     • Similar activities are currently problematic because the mandated organizations lack such experience
     • Therefore this type of collaboration should increase among all organizations involved in SDI establishment
  4. Basic description of involved organizations
     • Slovak Environmental Agency
       o Public sector body
       o Coordinates SDI development within the environmental domain
       o Contributes to national and international SDI development and implementation via:
          Metadata system for spatial and non-spatial data
          Network services (discovery, view, download, transformation, spatial data services)
          Consolidated spatial data repository (Central Geographical System)
          Client applications (environmental geoportal, domain-specific web map clients)
       o Provides technical support for INSPIRE transposition and NSDI implementation
       o LMO & expert representation in the IOC TF within INSPIRE
  5. Basic description of involved organizations
     • Slovak University of Technology – geodetic departments
       o Academic sector organization
       o Offers a master's degree study programme in geoinformatics
       o Has 2 defended PhD theses and 4 in progress related to the GIS/SDI domain – data quality, data modelling, web services, GIS usage in specific domains (archaeology, floods, geodesy)
       o Performs education and research within GIS/SDI
       o Implements web map clients, web map services and testing tools
       o Performed initial testing of discovery and view services against INSPIRE requirements
          Presented at conferences (GIS Ostrava, EnviroIForum, GI2010)
          SDIC & expert representation in the IOC TF within INSPIRE
  6. Basic description of involved organizations
     • Slovak University of Agriculture – Dept. of Landscape Planning and Ground Design
       o Academic sector organization
       o Offers a master's degree study programme in ground design and GIS
       o The department produces a large amount of land data as results of ground design projects
       o A current pilot project concerns the design and implementation of an information system on the hydro-physical properties of soils in Slovakia, based on SDI principles:
          o Metadata and discovery services
          o View and download services
          o Processing services for spatial analyses
  7. SDI components provided by SEA to INSPIRE/NSDI
     • SEA covers data themes from all 3 INSPIRE annexes
       • Annex I – Hydrography, Protected sites
       • Annex II – Land cover
       • Annex III – Bio-geographical regions, Habitats & biotopes, Species distribution

     | Network service type | Annex I                       | Annex II                      | Annex III                   |
     |----------------------|-------------------------------|-------------------------------|-----------------------------|
     | Discovery service    | YES (terra catalog CSW 2.0.2) | YES (terra catalog CSW 2.0.2) | NOT YET                     |
     | View service         | YES (ArcGIS Server WMS 1.3)   | YES (ArcGIS Server WMS 1.3)   | YES (ArcGIS Server WMS 1.3) |
     | Download service     | YES (ArcGIS Server WFS 1.1)   | YES (ArcGIS Server WFS 1.1)   | YES (ArcGIS Server WFS 1.1) |
  8. • Metadata for datasets & services served by the discovery service (CSW)
     • Maps served by the view service (WMS)
     • Data served by the download service (WFS)
  9. Testing methodology
     • Testing coverage – INSPIRE, ISO, OGC
     • Testing scope – service interface / quality / content
     • Testing temporal extent – short / long term
     • Testing scenarios – conceptual/application design, testing tool
     • Testing performance – complex/partial testing model
     • Testing reporting – report template
  10. Testing environment proposal
     [UML use-case diagram: SEA/SUT testing environment – publishNetworkService, SetupTestingScenarios, SelectTestingScenario, deployTests, reportResults, displayResults, sendReport, downloadReport; actors: SDI components tester, SDI components provider]
     • Users – SDI component tester, SDI component provider
     • Use cases – PublishNS, Define/ConfigureTestingScenario, SelectTestingScenario, PerformTest, ReportResults, ViewResults, SendResults, SaveResults, Communicate
  11. Testing tools
     • Webtest tool – a web application for web service testing
       o Developed by testers at SUT with Java and JSP technologies
       o Web GUI
       o Single, multiple and simultaneous GET and POST requests
       o Measures times:
          Between the request (RQ) and the first byte of the response (RS) download
          Between the request and the last byte of the response download
       o Checks for the presence of a predefined string
       o Configuration based on XML files
          Service endpoint definition
          Testing scenario definition (request to the service)
       o Provides results in tabular form
       o Does not yet provide statistics for long-term testing
       o Accessible online at: http://geo.vm.stuba.sk:8080/webtest/
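The two timings named above (request to first response byte, request to last response byte) plus the predefined-string check can be sketched as follows. This is a minimal Python illustration, not the actual Java/JSP implementation of webtest; the function names are hypothetical.

```python
import time
from urllib.request import urlopen

def contains_string(body: bytes, expected: str) -> bool:
    """Pass/fail check: is the predefined string present in the response body?"""
    return expected.encode() in body

def time_request(url: str, expected: str):
    """GET a service endpoint and measure the two times the slide names:
    request-to-first-byte and request-to-last-byte of the response."""
    start = time.monotonic()
    with urlopen(url) as resp:
        first = resp.read(1)            # blocks until the first byte arrives
        t_first = time.monotonic() - start
        body = first + resp.read()      # drain the rest of the response
        t_last = time.monotonic() - start
    return t_first, t_last, contains_string(body, expected)
```

A call like `time_request(capabilities_url, "WMS_Capabilities")` would flag a ServiceException response as a failure even when it arrives quickly, which is why a plain timing measurement is not enough.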
  12. Testing tools – webtest: testcase.xml, results
  13. Testing tools
     • MDValidator – a desktop application for batch metadata validation
       o Developed by testers at SUT as a Java application
       o Desktop GUI
       o Performs batch validation of metadata as XML files from a local directory
       o Invokes the online INSPIRE validator REST web service
       o Provides results in XML/HTML form
       o Does not yet validate against the ISO gmd schema
       o Useful after batch metadata transformation (XSLT)
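The batch step of such a validator can be sketched as below: scan a local directory and reject files that are not even well-formed XML, leaving the rest for the online INSPIRE validator call (omitted here). The function name and result format are illustrative, not MDValidator's actual interface.

```python
import os
import xml.etree.ElementTree as ET

def prevalidate_dir(path: str) -> dict:
    """Check every .xml file in a local directory for well-formedness;
    only well-formed files would be forwarded to the online validator."""
    results = {}
    for name in sorted(os.listdir(path)):
        if not name.endswith(".xml"):
            continue
        try:
            ET.parse(os.path.join(path, name))
            results[name] = "well-formed"
        except ET.ParseError as exc:
            results[name] = f"parse error: {exc}"
    return results
```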
  14. Testing realization – discovery service
     • Testing coverage – INSPIRE requirements
       o Search criteria – 25 queryables
       o Operations – 4 operations with predefined parameters
       o Quality of service – 3 parameters
     • Testing scope
       o Tested 3 operations (DiscoverMD, GetDSMD, PublishMD) and 23 queryables
       o Estimated 2 quality parameters (performance, capacity), MD
     • Temporal extent
       o Short-term testing – all predefined testing scenarios launched a few times
     • Testing scenarios + tools
       o One for queryables + performance + the DiscoverMD operation
       o One for capacity as a combination of operations
       o One for the PublishMD operation + performance
       o webtest + MDValidator
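A queryables scenario of the kind listed above boils down to issuing one GetRecords request per queryable. A minimal sketch of building such a request as a CSW 2.0.2 KVP URL follows; the endpoint, queryable name and search literal are placeholders, and the CQL constraint form is one of several the standard allows.

```python
from urllib.parse import urlencode

def csw_getrecords_url(endpoint: str, queryable: str, literal: str) -> str:
    """Build a CSW 2.0.2 GetRecords request exercising one queryable
    via a CQL 'LIKE' constraint."""
    params = {
        "service": "CSW",
        "version": "2.0.2",
        "request": "GetRecords",
        "typeNames": "csw:Record",
        "resultType": "results",
        "constraintLanguage": "CQL_TEXT",
        "constraint_language_version": "1.1.0",
        "constraint": f"{queryable} LIKE '%{literal}%'",
    }
    return endpoint + "?" + urlencode(params)
```

Looping this over the 23 tested queryables and feeding each URL to webtest reproduces the "queryables + performance" scenario.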
  15. Testing realization – view services
     • Testing coverage – INSPIRE requirements
       o Operations – 3 operations with predefined parameters
       o Quality of service – 3 parameters
     • Testing extent
       o Tested 2 operations (GetMap, GetVSMD)
       o Estimated 2 quality parameters (performance, capacity)
     • Temporal extent
       o Short-term testing – all predefined testing scenarios launched once
     • Testing scenarios + tool
       o One for the GetMap operation + performance estimation
       o One for capacity as a combination of operations
       o webtest
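A GetMap scenario like the one above is just a parameterised request. A sketch of building a WMS 1.3.0 GetMap URL with the mandatory parameters; the endpoint, layer name and bounding box are placeholders.

```python
from urllib.parse import urlencode

def wms_getmap_url(endpoint: str, layer: str, bbox: tuple,
                   width: int = 800, height: int = 600) -> str:
    """Build a WMS 1.3.0 GetMap request; note that in WMS 1.3 with
    EPSG:4326 the BBOX axis order is latitude first."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "STYLES": "",
        "CRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
    }
    return endpoint + "?" + urlencode(params)
```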
  16. Testing realization – download services
     • Testing coverage – INSPIRE requirements
       o Operations – 4 operations + 2 more for spatial object access
       o Quality of service – 3 parameters
     • Testing extent
       o Tested 3 operations (GetSDS, GetDSMD, DescribeSDS)
       o Tested 2 QoS parameters
     • Temporal extent
       o Short-term testing – all predefined testing scenarios launched once
     • Testing scenarios + tool
       o One for GetSpatialDataSet + performance & capacity estimation
       o One for DescribeSpatialDS and GetDSMD + performance
       o webtest
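Against the WFS 1.1 endpoints described earlier, a GetSpatialDataSet scenario maps onto a WFS GetFeature request. A sketch follows; the endpoint and feature type name are placeholders, and `maxFeatures` is added here only to keep test responses bounded.

```python
from urllib.parse import urlencode

def wfs_getfeature_url(endpoint: str, type_name: str, max_features: int = 100) -> str:
    """Build a WFS 1.1.0 GetFeature request for one feature type."""
    params = {
        "service": "WFS",
        "version": "1.1.0",
        "request": "GetFeature",
        "typeName": type_name,
        "maxFeatures": max_features,
    }
    return endpoint + "?" + urlencode(params)
```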
  17. Testing results summary – report template structure proposal 1/2

     Interface results (operations & parameters):

     | Tested                 | Discovery service                | View service                     | Download service                 |
     |------------------------|----------------------------------|----------------------------------|----------------------------------|
     | GetDSMetadata          | Supported, parameters not tested | Supported, parameters not tested | Supported, parameters not tested |
     | DiscoverMetadata       | Supported, parameters not tested | –                                | –                                |
     | PublishMetadata        | Supported, parameters not tested | –                                | –                                |
     | LinkService            | Not tested                       | Not tested                       | Not tested                       |
     | GetMap                 | –                                | Supported, parameters not tested | –                                |
     | GetSpatialDataSet      | –                                | –                                | Supported, parameters not tested |
     | DescribeSpatialDataSet | –                                | –                                | Supported, parameters not tested |
     | GetSpatialObject       | –                                | –                                | Not tested                       |
     | DescribeSpatialObject  | –                                | –                                | Not tested                       |
  18. Testing results summary – report template structure proposal 2/2

     Quality of service results:

     | Tested       | Discovery service                                             | View service                                                         | Download service                                               |
     |--------------|---------------------------------------------------------------|----------------------------------------------------------------------|----------------------------------------------------------------|
     | Performance  | Satisfied (115 requests sent, 115 responses < 3 s)            | Satisfied on 90% (30 requests sent, 27 responses < 5 s)              | Satisfied (30/10 requests sent, 30/10 responses < 10/30 s)     |
     | Capacity     | Satisfied (30 simultaneous requests sent, 30 responses < 3 s) | Satisfied on 70% (20 simultaneous requests sent, 14 responses < 5 s) | Satisfied (10 simultaneous requests sent, 10 responses < 30 s) |
     | Availability | Not tested                                                    | Not tested                                                           | Not tested                                                     |

     Other criteria results:

     | Search criteria                                    | Supported, 23/25 criteria tested |
     | Search criteria for the GetSpatialObject operation | Not tested                       |

     Content of the service results:

     | Metadata models | Not tested |
     | Data models     | Not tested |
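The percentage entries in the QoS rows ("satisfied on 90%", "satisfied on 70%") follow from a simple pass-rate computation over the measured response times; a sketch, with the response-time limits taken from the INSPIRE QoS requirements:

```python
def qos_pass_rate(response_times_s, limit_s):
    """Share of responses faster than the QoS limit, e.g. 27 of 30
    responses under the limit gives 0.9 ('satisfied on 90%')."""
    ok = sum(1 for t in response_times_s if t < limit_s)
    return ok / len(response_times_s)
```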
  19. Conclusions
     • The initial short-term testing has brought first, quite positive results
     • Collaboration between the public and academic sectors is important
     • The testing methodology requires deep knowledge and understanding of all requirements defined in the related regulations
     • Preparation of testing scenarios is important and time-consuming
     • Testing results should be easy to interpret and understand
     • Where possible and appropriate, various levels of compliance can be introduced
  20. Future work
     • Discussions, proposals and suggestions on the testing methodology
     • Testing scenario extensions
       o For all required operations (Link NS)
       o Request and response parameter validation against INSPIRE-specific constraints (e.g. NS INSPIRE MD within the GetNSMD response, the Language parameter, layer MD, spatial dataset MD)
       o Long-term testing scenarios for accurate estimation of service quality parameters
       o Documentation of individual scenarios (conceptual level)
     • Testing report template
       o Discussions and decisions on the form and content of the reports (e.g. tabular form with information such as date, test description, test execution, results, pass/fail definition, comments, ...)
  21. Future work
     • Testing environment extensions
       o Functions for report exports, statistics calculation, plots
       o Implementation of testing series (e.g. an INSPIRE discovery service testing scenario, ...)
       o Storage of results in a database to avoid loss of results in the long term
     • Testing of local spatial data and service compliance against the INSPIRE Annex II+III data specifications
       o Feasibility testing (Annex III)
       o Fitness-for-purpose testing
     • Efforts to promote testing & validation (awareness raising)
     • Efforts to formalise testing & validation (proposal for the establishment of a common testing platform allowing the sharing of tools, materials, methodologies, experience and expertise related to SDI components testing)
  22. Thank you very much!
     Contact info:
     SUT in Bratislava, Dept. of Theoretical Geodesy
     SEA in Banská Bystrica, Dept. of Environmental Informatics
     SUA in Nitra, Dept. of Landscape Planning and Ground Design
     tomas.kliment@gmail.com
     martin.tuchyna@gmail.com
     marcel.kliment@uniag.sk
