
Data Usability Assessment for Remote Sensing Data: Accuracy of Interactive Data Quality Interpretation


Erik Borg, Bernd Fichtelmann - German Aerospace Center, German Remote Sensing Data Center
Hartmut Asche - Department of Geography, University of Potsdam



1. Data usability assessment for remote sensing data: Accuracy of interactive data quality interpretation
   Erik Borg, Bernd Fichtelmann - DFD | German Aerospace Centre | Neustrelitz
   Hartmut Asche - Dept of Geography | University of Potsdam | Germany
   ICCSA 2011 | GEOG-AN-MOD 2011 | University of Santander | 20-23/06/2011
2. Motivation
   - The increasing availability of remote sensing (RS) data complicates orientation in remote sensing data bases
   - To facilitate orientation in RS data bases, data providers make additional information available, such as geographic location, acquisition time and data quality
   - Cloud cover degree is a frequently used parameter for assessing RS data quality, but it records data quality insufficiently
   - ESA has defined a new quality measure, data usability, which is interpreted interactively by operators based on:
     - Technical data errors: lost image lines/segments, scan mirror anomalies
     - Unusable image segments: clouds, haze, shadow and their distribution within a scene
3. Search in data bases | Data acquisition
   - DESCW retrieval system: Display Earth Remote Sensing Swath Coverage
   - Data selection criteria for LANDSAT 7 / ETM+ data include mission, orbit, track, frame, date, time, data quality, station, coordinates, ...
   - Excerpt of example metadata:
       SCENE_LL_CORNER_LON = 25.4800
       SCENE_LR_CORNER_LAT = 43.6500
       SCENE_LR_CORNER_LON = 27.8100
       HORIZONTAL_DISPLAY_SHIFT = 0
       SCENE_CCA = 50
       UL_QUAD_CCA = 60
       UR_QUAD_CCA = 60
       LL_QUAD_CCA = 30
       LR_QUAD_CCA = 50
       SUN_AZIMUTH_ANGLE = 138.5
       SUN_ELEVATION_ANGLE = 59.8
     CCA = Cloud Cover Assessment; QUAD = quadrant; UL = upper left, UR = upper right, LL = lower left, LR = lower right
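The metadata above follows a simple "KEY = value" convention. As a minimal sketch (assuming the metadata is available as plain text, and treating the scene-level cloud cover as the mean of the four quadrant values purely for illustration, not as the provider's actual definition), the fields can be read and checked like this:

```python
# Minimal sketch: parse DESCW-style "KEY = value" metadata lines and derive a
# scene-level cloud cover figure from the four quadrant values.
# Assumption: the scene CCA is approximated here as the mean of the quadrant
# CCAs; the provider's actual definition may differ.

METADATA_TEXT = """
SCENE_CCA = 50
UL_QUAD_CCA = 60
UR_QUAD_CCA = 60
LL_QUAD_CCA = 30
LR_QUAD_CCA = 50
SUN_ELEVATION_ANGLE = 59.8
"""

def parse_metadata(text: str) -> dict:
    """Turn 'KEY = value' lines into a dict (floats where possible)."""
    fields = {}
    for line in text.splitlines():
        if "=" not in line:
            continue
        key, value = (part.strip() for part in line.split("=", 1))
        try:
            fields[key] = float(value)
        except ValueError:
            fields[key] = value
    return fields

meta = parse_metadata(METADATA_TEXT)
quadrants = ["UL_QUAD_CCA", "UR_QUAD_CCA", "LL_QUAD_CCA", "LR_QUAD_CCA"]
mean_quadrant_cca = sum(meta[q] for q in quadrants) / len(quadrants)
print(f"Quadrant mean CCA: {mean_quadrant_cca:.1f} %, reported SCENE_CCA: {meta['SCENE_CCA']:.0f} %")
```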
4. Processing chain - national ground segment for Landsat 7
   Objectives:
   - Decoding and data synchronisation
   - De-commutation
   - Production of browse data
   - Status information
   - Storage of raw data
   [Figure: schematic representation of the Landsat ground segment (modified from Beruti 2002), showing demodulators, front-end handler, ingestion servers, catalogue, quality control and monitoring workstations, and archive media (DLT, Exabyte, CD); the interactive data usability estimation step is marked in red]
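To make the order of these steps concrete, here is a minimal sketch of the chain as a sequence of stages. The stage names follow the slide, but the function bodies are placeholders, not the actual ground segment software:

```python
# Minimal sketch of the processing chain as a sequence of stages; bodies are
# placeholders for illustration only.

from typing import Callable, List

def decode_and_synchronise(raw: bytes) -> bytes:
    # Placeholder: frame synchronisation and decoding of the downlinked stream
    return raw

def decommutate(frames: bytes) -> dict:
    # Placeholder: split the synchronised stream into data channels
    return {"image_data": frames, "housekeeping": b""}

def produce_browse(channels: dict) -> dict:
    # Placeholder: generate reduced-resolution quick-look/browse data
    channels["browse"] = channels["image_data"][:1024]
    return channels

def store_raw(channels: dict) -> dict:
    # Placeholder: archive the raw data (tape/CD in the original ground segment)
    channels["archived"] = True
    return channels

PIPELINE: List[Callable] = [decode_and_synchronise, decommutate, produce_browse, store_raw]

def run_chain(raw: bytes) -> dict:
    product = raw
    for stage in PIPELINE:
        product = stage(product)
    # The interactive data usability estimation (marked red in the slide's
    # figure) would be applied to the browse product at this point.
    return product

print(sorted(run_chain(b"\x00" * 4096).keys()))
```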
5. Derivation of test data
   [Figures: receiving circle of the Neustrelitz ground station; spatial distribution of the Landsat 7/ETM+ database]
6. Definitions of data quality
   Assessment of cloud cover degree:
   - Detection and identification of cloud-covered pixels
   - Ratio of the number of cloud-covered pixels to the total number of pixels of the assessment unit
   Data usability assessment:
   - Not usable image segments: detection of clouds and cloud shadows, their distribution and configuration within the scene
   - Technically induced image errors: lost lines and sectors, scan mirror anomalies
   [Figure: binary cloud / no-cloud mask, example with 25 % cloudiness]
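The cloud cover degree defined here is simply the ratio of cloud-covered pixels to all pixels of the assessment unit. A minimal sketch, assuming a binary cloud mask and a quadrant split chosen only for illustration (not taken from the authors' processing code):

```python
# Cloud cover degree: ratio of cloud-covered pixels to total pixels of the
# assessment unit (here a toy scene and its four quadrants).

import numpy as np

def cloud_cover_degree(cloud_mask: np.ndarray) -> float:
    """Cloud cover in percent for a boolean mask (True = cloud-covered pixel)."""
    return 100.0 * cloud_mask.sum() / cloud_mask.size

# Toy 4x4 mask with 4 of 16 pixels flagged as cloud -> 25 % cloudiness
mask = np.zeros((4, 4), dtype=bool)
mask[:2, :2] = True
print(f"Scene cloud cover: {cloud_cover_degree(mask):.0f} %")

# Quadrant-wise assessment (UL, UR, LL, LR), mirroring the *_QUAD_CCA metadata fields
rows, cols = mask.shape
quadrants = {
    "UL": mask[: rows // 2, : cols // 2],
    "UR": mask[: rows // 2, cols // 2 :],
    "LL": mask[rows // 2 :, : cols // 2],
    "LR": mask[rows // 2 :, cols // 2 :],
}
for name, quad in quadrants.items():
    print(f"{name}_QUAD_CCA = {cloud_cover_degree(quad):.0f} %")
```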
7. Quick-look data for quality assessment
   - Quality assessment by interpreters
   - 4 quadrants
   - Real-colour coded data
8. Equations - assessment of evaluation subjectivity
   Required sample size:
     \( n \ge \frac{z^{2}\,P(1-P)}{d^{2}} = \frac{1.96^{2}\cdot 0.25}{0.05^{2}} \approx 384 \rightarrow 400 \) quadrants
   where
   - d = 0.05: acceptable error (significance level of 5 %)
   - z = 1.96: z-value of the standard normal distribution
   - P(1-P) = 0.25: maximum value of the variance term
   - Population: 11,828 quadrants
   Further symbols: d_i - absolute error; SD_DU - standard deviation of data usability
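Reading the slide's numbers as the standard sample size estimate for a proportion (an assumption, but one that reproduces the stated n ≥ 384 and the rounded sample of 400), the calculation works out as follows:

```python
# Sample size estimate for a proportion: n >= z^2 * P(1-P) / d^2

import math

z = 1.96           # z-value of the standard normal distribution (95 % confidence)
p_variance = 0.25  # maximum of P(1-P), reached at P = 0.5
d = 0.05           # acceptable absolute error (5 %)

n_required = z**2 * p_variance / d**2
print(f"Required sample size: n >= {n_required:.1f}")                        # 384.2
print(f"Rounded sample used for the assessment: {math.ceil(n_required / 100) * 100}")  # 400
```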
9. Equations - mean error vs. single interpreter | metadata
   - Modulus of the absolute error of the mean data usability with respect to a single interpreter's assessment
   - Modulus of the absolute error of the mean data usability with respect to the metadata assessment
   [Equations shown as images in the original slide]
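The formulas themselves are only shown as images on the slide, so the following is a hedged reconstruction: it assumes the two measures are defined per quadrant as d_i = |DU_mean - DU_i| (mean data usability vs. a single interpreter) and d_meta = |DU_mean - DU_meta| (mean data usability vs. the metadata value), together with SD_DU as the standard deviation of the interpreters' votes. The numbers are hypothetical, not study data:

```python
# Sketch of the error measures named on the slide (reconstruction, not the
# authors' verbatim definitions).

import statistics

# Hypothetical data usability votes (0-100) of several interpreters for one quadrant
interpreter_votes = [70, 65, 75, 60, 70]
metadata_value = 80  # hypothetical data usability value stored in the metadata

du_mean = statistics.mean(interpreter_votes)
sd_du = statistics.stdev(interpreter_votes)

errors_single = [abs(du_mean - vote) for vote in interpreter_votes]
error_metadata = abs(du_mean - metadata_value)

print(f"DU_mean = {du_mean:.1f}, SD_DU = {sd_du:.1f}")
print(f"Absolute errors vs. single interpreters: {errors_single}")
print(f"Absolute error vs. metadata: {error_metadata:.1f}")
```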
10. Mean data usability vs. standard deviation
    - Interpreter assessment plotted against the standard deviation of the data usability class
    - Dashed line: trend
    [Scatter plot shown in the original slide]
11. Mean data usability vs. data usability within metadata
    - Comparison of mean data usability and data usability (metadata) for the 400 quadrants
    - Dotted line: 1:1 line; solid line: regression line
    [Scatter plot shown in the original slide]
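A minimal sketch of the comparison behind this plot: regress the metadata data usability against the interpreters' mean data usability and measure the deviation from the 1:1 line. The values below are synthetic stand-ins, not the 400 quadrants evaluated in the study:

```python
# Regression line vs. 1:1 line for mean interpreter data usability against the
# metadata value (synthetic example data).

import numpy as np

du_mean_interpreters = np.array([10, 25, 40, 55, 70, 85, 95], dtype=float)
du_metadata = np.array([15, 20, 45, 50, 75, 80, 90], dtype=float)

# Least-squares regression line (solid line in the slide's plot)
slope, intercept = np.polyfit(du_mean_interpreters, du_metadata, deg=1)

# Agreement with the 1:1 line (dotted line in the slide's plot)
rmse_vs_identity = np.sqrt(np.mean((du_metadata - du_mean_interpreters) ** 2))

print(f"Regression: DU_metadata ~ {slope:.2f} * DU_mean + {intercept:.1f}")
print(f"RMSE against the 1:1 line: {rmse_vs_identity:.1f}")
```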
12. Single interpreter assessment vs. standard deviation
    - Evaluation by interpreters plotted against the standard deviation
    - The results of each individual interpreter are marked by a separate symbol and colour
    [Scatter plot shown in the original slide]
13. Operational aspects
    Extended quick-look product with cloud mask, operator vote and automaton vote
14. Operational aspects
    Extended quick-look product for monitoring the actual processing status
15. Thank you for your attention! Questions, comments, feedback?
    [email_address] [email_address]
    ICCSA 2011 | GEOG-AN-MOD 2011 | University of Santander | 20-23/06/2011
