BIOMASS_E2ES_IGARSS2011.ppt


  1. BIOMASS End-to-End Mission Performance Simulator
     Paco López-Dekker, Francesco De Zan, Thomas Börner, Marwan Younis, Kostas Papathanassiou (DLR); Tomás Guardabrazo (DEIMOS); Valerie Bourlon, Sophie Ramongassie, Nicolas Taveneau (TAS-F); Lars Ulander, Daniel Murdin (FOI); Neil Rogers, Shaun Quegan (U. Sheffield); and Raffaella Franco (ESA)
     Microwaves and Radar Institute, German Aerospace Center (DLR)
  2. Project Context and Objectives
     • BEES: BIOMASS End-to-End (mission performance) Simulator.
     • ESA-funded project in the context of the BIOMASS EE-7 Phase-A study.
     • Provides a tool to evaluate the expected end-to-end performance of the mission:
       - Realistic, distributed scenes
       - Models of residual system errors (noise, ambiguities, instrument stability, channel imbalances…)
       - Ionospheric disturbances (Faraday rotation and scintillation)
       - Processing: L0, L1a, L1b; ionospheric error correction; L2 retrieval
     • The focus is on including all main effects and disturbances; it is not a detailed instrument simulator.
  3. BEES Overview
  4. BEES Modules
     • "Engineering" modules:
       - Geometry Module (GM): provides a common geometry to all modules [DEIMOS]
       - Observing System Simulators (OSS-A & OSS-B) [A: DLR; B: Thales Alenia Space]
       - Product Generation Modules (PGM) [DLR]: PGM-L1a and PGM-L1b
     • "Scientific" modules:
       - Scene Generation Module (SGM) [DLR + U. Chalmers]
       - Ionospheric modules [U. of Sheffield]: Ionospheric Generation Module (IGM) and Ionospheric Correction Module (ICM)
       - Level-2 Retrieval Module (L2RM) [FOI]
     • Performance Evaluation Modules [DLR]: PEM-L1b and PEM-L2
  5. BEES Block Diagram: OpenSF Simulation Control
     • OpenSF drives the E2ES. This includes:
       - The user interface
       - Execution of Monte Carlo runs
       - Etc.
  6. BEES Diagram: OSS
     • Three sub-modules:
       - Dummy Radar Parameter Generator (RPG)
       - System Errors and Sensitivity Module (SES)
       - Impulse Response Function (IRF) Module
     • IRF strategy (see the sketch below): the IRF models the SAR system plus the processing, which avoids the generation of raw data.
     • SES strategy: model the residual errors.
     • Two OSS versions, corresponding to the two industry Phase-A studies.
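As a rough illustration of the IRF strategy (a sketch, not the BEES OSS), the snippet below convolves a complex speckle scene with a separable 2-D sinc impulse response. The convolution output is already a focused, SLC-like image, so no raw data ever need to be generated or focused. All resolutions, spacings and sizes are illustrative assumptions, not BIOMASS values.

```python
import numpy as np
from scipy.signal import fftconvolve

def sar_irf(n=33, res_rg=25.0, res_az=12.0, spacing=5.0):
    """Separable 2-D sinc impulse response of the focused SAR system
    (unweighted processing assumed); resolutions and pixel spacing in
    metres are illustrative placeholders."""
    ax = (np.arange(n) - n // 2) * spacing
    return np.outer(np.sinc(ax / res_az), np.sinc(ax / res_rg))

# A complex speckle scene convolved with the IRF directly yields a
# focused SLC-like image: no raw data are generated or focused.
rng = np.random.default_rng(0)
scene = (rng.standard_normal((512, 512)) +
         1j * rng.standard_normal((512, 512))) / np.sqrt(2)
slc = fftconvolve(scene, sar_irf(), mode="same")
```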
  7. SGM: Scene Definition
     • Scene inputs: forest type (out of a predefined list) and mean biomass level (at the hectare level).
     • The SGM generates a spatial distribution of "single" trees, each with an individual (top) height/biomass tag.
       [Example panels: 200 t/ha with Clark-Evans index 1.8; 300 t/ha with Clark-Evans index 0.8; colour scale from 0 t/ha to 500 t/ha]
     • Output to the forward model, on a 100 m × 100 m grid: biomass (t/ha) and tree height (h100).
     (A sketch of the Clark-Evans index follows below.)
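The Clark-Evans index quoted in the example panels quantifies how regular (R > 1) or clustered (R < 1) the simulated tree pattern is. Below is a minimal sketch of its textbook definition, not the SGM implementation; the stand size and tree count are made up for the example.

```python
import numpy as np
from scipy.spatial import cKDTree

def clark_evans(xy, area):
    """Clark-Evans index R: mean observed nearest-neighbour distance over
    its expectation under complete spatial randomness, 1/(2*sqrt(N/area)).
    R > 1 indicates regular spacing, R < 1 indicates clustering."""
    dist, _ = cKDTree(xy).query(xy, k=2)   # k=2: self plus nearest neighbour
    return dist[:, 1].mean() * 2.0 * np.sqrt(len(xy) / area)

# Example: a 1-ha (100 m x 100 m) stand with uniformly random trees -> R ~ 1
rng = np.random.default_rng(1)
print(clark_evans(rng.uniform(0.0, 100.0, size=(400, 2)), area=1e4))
```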
  8. SGM Output (Ground Truth)
     [Maps: biomass and tree height (H100)]
  9. Input to the PGM: PolInSAR Covariance Matrices
     [Backscatter maps: σHH, σHV, σVV]
  10. Input to the PGM: PolInSAR Covariance Matrices (cont.)
      [Interferometric coherence maps: ρHH1–HH2, ρHV1–HV2, ρVV1–VV2]
  11. BEES Block Diagram: PGM
  12. Review of the PGM Algorithm
      Macro steps (with inputs drawn from the SGM, OSS, GM, IGM and ICM):
      1. Generation of the interferometric/polarimetric channels for the scattered signal (correlated) and the noise (uncorrelated)
      2. Spectral shift modulation (geometric decorrelation, part I)
      3. 2-D convolution
      4. Addition of the ionospheric phase screen (scintillation) and Faraday rotation
      5. Spectral shift demodulation (geometric decorrelation, part II)
      6. Ambiguity stacking
      7. Additional system disturbances (cross-talk, phase and gain drifts…)
      8. L1a product generation
      9. L1b product generation (multilooking)
  13. Multichannel Signal Simulation
      • N independent complex channels are turned into N correlated complex channels by a channel linear combination; spatial convolutions then impose the desired spectral properties on each complex channel (see the sketch below).
      [Example maps: tree height; HH–HH coherence; HH SLC]
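A minimal sketch of the channel linear combination for circular-Gaussian speckle: mixing independent channels with the Cholesky factor of the target covariance matrix yields channels with exactly the prescribed correlations. This illustrates the principle only; the PGM additionally applies the spatial convolutions listed above.

```python
import numpy as np

def correlated_speckle(cov, shape, seed=0):
    """Draw N mutually correlated circular-Gaussian speckle channels with a
    prescribed N x N (Hermitian) covariance matrix by mixing independent
    channels with its Cholesky factor -- the 'channel linear combination'."""
    rng = np.random.default_rng(seed)
    n = cov.shape[0]
    indep = (rng.standard_normal((n, *shape)) +
             1j * rng.standard_normal((n, *shape))) / np.sqrt(2)
    mix = np.linalg.cholesky(cov)            # cov = mix @ mix.conj().T
    return np.tensordot(mix, indep, axes=1)  # correlated channels

# Two channels with interferometric coherence 0.8:
cov = np.array([[1.0, 0.8], [0.8, 1.0]], dtype=complex)
c1, c2 = correlated_speckle(cov, shape=(256, 256))
coh = np.mean(c1 * c2.conj()) / np.sqrt(np.mean(abs(c1)**2) * np.mean(abs(c2)**2))
print(abs(coh))   # close to 0.8
```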
  14. Introduction of the Ionospheric Distortion
      • The ionosphere is modeled as a layer between the orbit and the targets.
      • The aperture angle is what really matters: one part of the ionospheric layer modifies one part of the raw data for target 1, but a different part for target 2.
      • Consequently, the ionospheric distortion cannot be applied directly to the raw data: the raw-data distortion is target dependent.
      • For a (virtual) lower orbit at the ionosphere height, with an equivalent aperture, the distortions can be applied directly to the raw data.
  15. BEES Block Diagram: Ionospheric Modules
      • The ionospheric correction block (ICM) applies the ionospheric correction (Faraday rotation and shifts only).
      • The simulation of the ionosphere is divided into two steps: first, the spectral coefficients describing the state of the ionosphere are generated; then, for a given spectrum, random realizations are drawn (see the sketch below).
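A hedged sketch of the two-step recipe, assuming a simple isotropic power-law spectrum; the exponent, amplitude and grid posting are placeholders rather than IGM parameters.

```python
import numpy as np

def iono_phase_screen(n=512, dx=100.0, slope=2.5, amp=1.0, seed=0):
    """Step 1: build the power spectrum describing the ionospheric state
    (here an isotropic power law).  Step 2: draw a random realization by
    filtering complex white noise with that spectrum."""
    rng = np.random.default_rng(seed)
    fx = np.fft.fftfreq(n, d=dx)
    k = np.hypot(*np.meshgrid(fx, fx, indexing="ij"))
    k[0, 0] = abs(fx[1])                  # avoid the DC singularity
    psd = amp * k ** (-slope)             # step 1: spectral description
    noise = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    return np.fft.ifft2(np.sqrt(psd) * noise).real   # step 2: realization

screen = iono_phase_screen()              # phase screen, 100 m posting
```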
  16. Level-2 Retrieval
      Discussed in the previous talk.
  17. L2 Retrieved Heights (H100)
      [Maps: SGM ground truth vs. L2 retrieval]
      • A range-dependent H100 bias is visible: software bug or realistic feature?
  18. L2 Retrieved Biomass
      [Maps: SGM ground truth vs. L2 retrieval]
  19. Performance Evaluation (L1b)
      • L1b performance is expressed in terms of element-wise covariance-matrix errors: bias and standard deviation (see the sketch below).
      • In the example: significant coherence loss due to the spectral shift.
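One possible way to tabulate the element-wise bias and standard deviation over repeated runs (the array layout and names are assumptions, not the PEM-L1b interface):

```python
import numpy as np

# cov_est: stack of (n_runs, N, N) sample covariance matrices from repeated
# BEES runs; cov_true: the (N, N) ground-truth matrix.
def element_wise_errors(cov_est, cov_true):
    err = cov_est - cov_true                   # per-run, per-element error
    return err.mean(axis=0), err.std(axis=0)   # element-wise bias and std
```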
  20. Performance Evaluation (L2)
      • L2 performance is expressed in terms of biomass and tree-height errors: bias and standard deviation.
      • Error statistics versus range and biomass level.
      • In the example: does the height error lead to the biomass error?
  21. Monte Carlo (Multiple Runs of BEES)
      • Monte Carlo simulations are implemented by OpenSF: BEES is run repeatedly, perturbing some input parameters if necessary.
      • Perturbation approach (see the sketch below):
        - Random realizations are implemented by the modules; OpenSF can provide a varying seed for independent realizations.
        - This gives control of the randomization to the module developers, ensuring physical correctness.
        - Most of the randomness is introduced by the IGM and the PGM.
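A toy sketch of the seed-forwarding idea; run_bees_once is a hypothetical stand-in for a full end-to-end run, not the OpenSF API.

```python
import numpy as np

def run_bees_once(config, seed):
    """Placeholder for one end-to-end run; in BEES the seed is forwarded to
    the modules that draw random realizations (mainly the IGM and PGM)."""
    rng = np.random.default_rng(seed)
    return config["true_value"] + rng.standard_normal()   # stand-in output

# Driver loop in the spirit of OpenSF: identical configuration, varying
# seed, so the repetitions are independent yet reproducible.
outputs = [run_bees_once({"true_value": 0.0}, seed=s) for s in range(100)]
print(np.mean(outputs), np.std(outputs))
```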
  22. Notes on Validation
  23. Validation: Challenges and Strategy
      • BEES is a complex software tool comprising modules developed by different teams in heterogeneous environments.
      • How do we know that the outputs are correct?
        - We are developing the tool precisely because we do not know (exactly) what we will get!
        - We are simulating random processes: speckle, random noise, random hardware disturbances, random realizations of the ionosphere, …
      • Validating the software requires approaches that resemble the post-launch validation/calibration of a real system: homogeneous scenes and point targets.
      • Validation needs to check whether the resulting statistics for some canonical cases agree with theory.
  24. Example: NESZ Validation
      • The NESZ is range dependent.
      • The test threshold is designed for a failure probability of 10^-3 around the nominal NESZ value; estimates outside the acceptance interval flag a test failure (see the sketch below).
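If the noise-power estimate averages M independent complex Gaussian samples, it follows a Gamma(M, NESZ/M) distribution, from which two-sided acceptance thresholds for a 10^-3 failure probability follow directly. Below is a sketch under that assumption; the actual BEES test statistic may differ.

```python
import numpy as np
from scipy import stats

def nesz_test_bounds(nesz_nominal, n_samples, p_fail=1e-3):
    """Two-sided acceptance interval for an averaged noise-power estimate,
    assuming it is the mean of n_samples independent exponential (complex
    Gaussian power) samples, i.e. Gamma(n_samples, NESZ/n_samples)."""
    g = stats.gamma(a=n_samples, scale=nesz_nominal / n_samples)
    return g.ppf(p_fail / 2.0), g.ppf(1.0 - p_fail / 2.0)

lo, hi = nesz_test_bounds(nesz_nominal=1e-3, n_samples=1000)
print(lo, hi)   # an estimate outside [lo, hi] flags a test failure
```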
  25. Example: PGM L1b Verification – Probabilistic Threshold
      • Due to the random nature of speckle, the estimated covariance matrices will not be identical to the true one (even when all error sources are turned off).
      • We can, however, evaluate the likelihood of a certain output given the input in probabilistic terms (e.g. using confidence intervals).
      • The test uses the complex coherences, i.e. the normalized elements of the sample covariance matrix.
      • Using a probability threshold, it is possible to bound the deviation; the threshold is a function of the desired error (t), the input coherence (γ) and the number of looks (L). A sketch follows below.
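A Monte Carlo sketch of how such a deviation threshold t(γ, L) could be tabulated for the sample coherence magnitude; BEES may instead use analytic confidence intervals. Because the threshold is built from |γ̂| − γ, it absorbs the estimator bias discussed on the next slide.

```python
import numpy as np

def coherence_threshold(gamma, looks, p_fail=1e-3, n_mc=10000, seed=0):
    """Threshold t such that P(| |g_hat| - gamma | > t) <= p_fail for the
    L-look sample coherence magnitude, tabulated by Monte Carlo.  Note that
    with n_mc = 10000 the 1 - 1e-3 quantile rests on few tail samples."""
    rng = np.random.default_rng(seed)
    def cpl(shape):
        return (rng.standard_normal(shape) +
                1j * rng.standard_normal(shape)) / np.sqrt(2)
    z1 = cpl((n_mc, looks))
    z2 = gamma * z1 + np.sqrt(1.0 - gamma**2) * cpl((n_mc, looks))
    num = np.abs((z1 * z2.conj()).mean(axis=1))
    den = np.sqrt((np.abs(z1)**2).mean(axis=1) * (np.abs(z2)**2).mean(axis=1))
    return np.quantile(np.abs(num / den - gamma), 1.0 - p_fail)

print(coherence_threshold(gamma=0.5, looks=250))   # slide example parameters
```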
  26. PGM L1b Verification – Caveat!
      • The assumption that the coherence estimate is unbiased does not hold for high coherences and a low number of looks.
      • For a given coherence, one has to make sure that enough looks are taken into account.
      [Histograms from 10^5 simulations: γ = 0.5 with L = 250, and γ = 0.95 with L = 30]
      • To validate the simulator we need (to simulate) large, homogeneous scenes. Sound familiar?
  27. Project Status / Outlook
      • Software almost complete:
        - Full handling of ambiguities is still missing.
        - Some ionospheric features/options are pending.
      • Validation and debugging are ongoing: distinguishing between bugs and features is not easy!
      • Mission performance assessment:
        - Once BEES is validated, it will be used to assess the mission performance for both Phase-A designs.
        - Hundreds of test cases, each requiring "N" Monte Carlo repetitions.
        - Weeks of simulation time.
