The document proposes a three-component decomposition approach for polarimetric SAR data that improves stability over the Freeman decomposition. It addresses the ill-posed nature of the Freeman decomposition by using Tikhonov regularization, which introduces a regularization parameter to make the problem well-posed. Experimental results show the proposed approach produces more stable decompositions with no negative powers compared to the standard Freeman decomposition.
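The paper's exact Freeman inversion is not reproduced here, but the regularization idea can be illustrated generically: Tikhonov regularization replaces an ill-posed least-squares problem min‖Ax − b‖² with min‖Ax − b‖² + λ‖x‖², solved as x = (AᵀA + λI)⁻¹Aᵀb. A minimal NumPy sketch (the matrix, data, and λ below are illustrative assumptions, not the paper's model):

```python
import numpy as np

def tikhonov_solve(A, b, lam):
    """Solve min ||A x - b||^2 + lam ||x||^2 via the normal equations."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

# Nearly rank-deficient system: unregularized least squares is unstable here.
A = np.array([[1.0, 1.0],
              [1.0, 1.000001],
              [1.0, 0.999999]])
b = np.array([2.0, 2.0, 2.0])

x = tikhonov_solve(A, b, lam=1e-3)  # small lam makes the inversion well-posed
```

The regularization parameter trades fidelity to the data against solution size; in the decomposition context this is what suppresses unphysical (negative) component powers.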
Kendall's Tau is a nonparametric correlation test used with ordinal or ranked data, like ranks in a competition. It measures the relationship between rank orders of two variables. Researchers analyzed rank orders of athletes in biking and running events of an ironman competition using Kendall's Tau due to ties in ranks. Kendall's Tau results range from -1 to 1, with values closer to the extremes indicating a stronger monotonic relationship between variables.
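The ironman study used a tie-handling variant (tau-b); as a minimal illustration of the underlying statistic, the tie-free tau-a counts concordant versus discordant rank pairs. The ranks below are hypothetical, not the study's data:

```python
from itertools import combinations

def kendall_tau_a(x, y):
    """Kendall's tau-a: (concordant - discordant) / total pairs.
    Assumes no ties; tied ranks call for the tau-b variant instead."""
    n = len(x)
    concordant = discordant = 0
    for i, j in combinations(range(n), 2):
        s = (x[i] - x[j]) * (y[i] - y[j])
        if s > 0:
            concordant += 1
        elif s < 0:
            discordant += 1
    return (concordant - discordant) / (n * (n - 1) / 2)

# Hypothetical bike vs. run finishing ranks for four athletes
tau = kendall_tau_a([1, 2, 3, 4], [1, 3, 2, 4])  # one swapped pair
```

With a single swapped pair among four athletes, five of the six pairs are concordant, giving tau = (5 − 1)/6 ≈ 0.67 — a strong but imperfect monotonic relationship.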
WE3.L09 - EVALUATION OF SYSTEM POLARIZATION QUALITY FOR POLARIMETRIC SAR IMAG...
This document evaluates polarization quality metrics for polarimetric synthetic aperture radar (PolSAR) imagery and target decomposition. It introduces the maximum normalized error (MNE) metric to quantify the effects of polarization distortions from the radar system. The MNE captures the maximum error over all possible target polarimetric responses. It provides an evaluation baseline for comparing the errors of different polarimetric decompositions under various distortion levels and configurations. Simulation results demonstrate that the MNE metric connects PolSAR image quality to interacting polarization distortions and serves as a reference for acceptable distortion levels.
TU1.L09.1 - APPLICATION OF POLARIMETRIC SAR TO EARTH REMOTE SENSING
Polarimetric SAR data allows for a more complete analysis of scattering mechanisms. There are various techniques for analyzing polarimetric data, including polarization signatures and eigenvector decompositions. Model-based decomposition compares observed covariance matrices to model-derived matrices to identify the dominant scattering mechanisms, such as surface, volume, or double-bounce scattering, on a pixel-by-pixel basis. This provides quantitative information about surface properties and distributions of scatterers within each resolution cell.
This document proposes a new polarimetric calibration method for SAR using forest and surface scattering targets. It models forest backscatter as consisting of volume, double bounce, and surface scattering. Two steps are used to determine channel imbalances and forest parameters: first using forest and a corner reflector, then estimating cross-talks. The method was evaluated using 26 Amazon datasets and showed stable parameters and good polarimetric signatures after calibration. Reflection symmetry was confirmed for forests.
This document discusses randomly changing radio frequency interference (RFI) detected in ALOS PALSAR data collected over the American Arctic. It presents an analysis of RFI signatures identified in the data and the development of an azimuth analysis-based notch filtering algorithm to mitigate the effects of the interference. The algorithm detects and removes RFI based on outliers identified in azimuth cuts through the range-frequency data. Results show the method successfully restores data quality and polarimetric signatures degraded by RFI from long-range radar stations. The document concludes growing RFI is an issue for microwave remote sensing and requires ongoing monitoring and mitigation efforts.
Model-based Polarimetric Decomposition using PolInSAR Coherence
This document presents a new model-based polarimetric decomposition technique that utilizes polarimetric SAR interferometry (PolInSAR) coherence. Current decomposition models have limitations in accurately modeling volume scattering and resolving scattering mechanism ambiguities. The proposed method introduces an adaptive volume scattering model parameterized by PolInSAR coherence. This allows the volume scattering to better fit diverse terrain types like forests and built-up areas. Experimental results on L-band PolInSAR data demonstrate the method's ability to discriminate between scattering mechanisms and improve upon existing decomposition techniques.
This document summarizes a study that used wide-swath interferometric synthetic aperture radar (InSAR) time series to map large-scale ground deformation over the Danakil depression in the Afar region of Ethiopia between 2006 and 2009. The time series analysis revealed deformation signals consistent with magmatic intrusions and inflation/deflation of volcanic centers. Modeling of the deformation supported deep magma intrusion beneath the central segment and lateral magma propagation and chamber inflation beneath Dabbahu volcano in the northern segment. The study demonstrated the potential of wide-swath InSAR time series for mapping long-wavelength ground deformation over large areas.
Dual-polarized synthetic aperture radar (SAR) data can be used to detect ships, oil rigs, buoys, and oil spills by analyzing the degree of polarization (DoP) within the data. The DoP is estimated from Stokes parameters, which describe the polarization of electromagnetic waves, and indicates how polarized versus depolarized the radar backscatter is from different targets. Higher DoP helps distinguish man-made objects from natural backgrounds. Experimental results show the DoP estimated from different dual-pol SAR modes can successfully identify ships and oil platforms within RADARSAT-2 and NASA/JPL UAVSAR imagery of coastal and offshore areas.
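The DoP follows directly from the Stokes parameters: DoP = √(S1² + S2² + S3²) / S0, where S0 is total intensity. A minimal sketch (the Stokes values below are illustrative, not measured RADARSAT-2 or UAVSAR data):

```python
import math

def degree_of_polarization(s0, s1, s2, s3):
    """DoP = sqrt(S1^2 + S2^2 + S3^2) / S0 for Stokes parameters S0..S3.
    1.0 = fully polarized (typical of man-made point targets),
    0.0 = fully depolarized (typical of distributed natural clutter)."""
    return math.sqrt(s1**2 + s2**2 + s3**2) / s0

ship_like = degree_of_polarization(1.0, 0.8, 0.4, 0.2)   # strongly polarized
sea_like = degree_of_polarization(1.0, 0.1, 0.05, 0.0)   # mostly depolarized
```

Thresholding this scalar per pixel is what separates deterministic man-made scatterers from the depolarizing sea-surface background.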
TU1.L09.3 - Fully Polarimetric TerraSAR-X Data: Data Quality and Scientific ...
The document summarizes research using fully polarimetric TerraSAR-X data for scientific analysis. Key findings include:
1) TerraSAR-X was able to achieve high quality polarimetric data and stable radiometric calibration over multiple years of operation.
2) Analysis of agricultural, urban, glacial and sea ice sites showed polarimetry can provide information on surface changes and feature separation in different environments.
3) Coherent scatterer detection techniques were able to identify stable radar targets in urban and glacial sites at high spatial resolution.
TWO-POINT STATISTIC OF POLARIMETRIC SAR DATA
This document discusses using wavelet transforms to analyze two-point statistics of polarimetric synthetic aperture radar (PolSAR) data. It introduces wavelet variance and kurtosis as metrics that can be applied to PolSAR data transformed using a wavelet frame. It then provides an example of applying this analysis to ALOS PALSAR data over Hawaii's Papau Seamount to characterize sea surface features.
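The paper applies a wavelet frame to PolSAR imagery; as a toy illustration of the two metrics it introduces, the sketch below computes level-1 Haar detail coefficients of a 1-D signal and their wavelet variance and kurtosis (this is not the paper's PolSAR pipeline):

```python
import numpy as np

def haar_details(x):
    """Level-1 Haar wavelet detail coefficients of a 1-D signal."""
    x = np.asarray(x, dtype=float)
    return (x[1::2] - x[0::2]) / np.sqrt(2.0)

def wavelet_variance(d):
    return np.mean(d**2)

def wavelet_kurtosis(d):
    m2 = np.mean(d**2)
    return np.mean(d**4) / m2**2  # ~3 for Gaussian details, larger for heavy tails

d = haar_details([1, 2, 3, 4, 5, 6, 7, 8])  # constant slope -> constant details
wv = wavelet_variance(d)                     # (1/sqrt(2))^2 = 0.5
```

The variance measures fluctuation strength per scale, while the kurtosis flags non-Gaussian (e.g. feature-bearing) statistics at that scale.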
3_Terrain catergorization for single Pol.ppt
The document proposes a method for terrain categorization of single-polarization SAR images based on scattering mechanisms. It applies speckle filtering, unsupervised classification, and assigns color-codes to scattering categories (surface, volume, double-bounce) to produce categorized imagery. Comparisons with PolSAR Pauli decomposition and Google Earth show the method provides a simple way to separate terrain types from single-channel SAR, though it has limitations compared to fully polarimetric data.
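For reference, the fully polarimetric Pauli decomposition against which the single-pol method is compared maps the complex scattering-matrix elements to three channel powers. A minimal sketch (test values are illustrative; reciprocity Shv = Svh is assumed):

```python
def pauli_powers(shh, shv, svv):
    """Pauli basis powers from complex scattering-matrix elements:
    surface-like |Shh+Svv|^2/2, double-bounce-like |Shh-Svv|^2/2,
    volume-like 2|Shv|^2 (reciprocity Shv = Svh assumed)."""
    p_surface = abs(shh + svv) ** 2 / 2.0
    p_double = abs(shh - svv) ** 2 / 2.0
    p_volume = 2.0 * abs(shv) ** 2
    return p_surface, p_double, p_volume

# Trihedral-like return: Shh = Svv, no cross-pol -> surface channel dominates
ps, pd_, pv = pauli_powers(1 + 0j, 0j, 1 + 0j)
```

A single-channel system observes only one of these complex elements, which is exactly the information gap the proposed categorization method works around.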
TU3.L09 - AN OVERVIEW OF RECENT ADVANCES IN POLARIMETRIC SAR INFORMATION EXTR...
This document provides an overview of recent advances in polarimetric synthetic aperture radar (PolSAR) information extraction algorithms and applications presented at the 2010 IEEE International Geoscience and Remote Sensing Symposium (IGARSS). It reviews developments in target decompositions, orientation angles, classification, segmentation, texture modeling, speckle filtering, compact polarimetry, and high-resolution PolSAR over the past five years. New satellite systems like ALOS, TerraSAR-X, and RADARSAT-2 have enabled applications in areas such as agriculture, forestry, geology and oceanography.
A polarimeter is used to measure the angle of rotation of light caused by optically active substances such as sugar. The experiment measures the rate of hydrolysis of sucrose into fructose and glucose by determining the initial rotation (α0), the rotation at each time point (αt), and the final rotation (α∞) with a polarimeter. HCl is used to accelerate the hydrolysis reaction.
THE SENTINEL-1 MISSION AND ITS APPLICATION CAPABILITIES
The Sentinel-1 mission is part of the GMES program and consists of two satellites to provide C-band SAR data for emergency response, marine and land monitoring, and other applications. The satellites operate in a near-polar orbit with a 12 day repeat cycle. The main acquisition mode is an interferometric wide swath mode with 5m range and 20m azimuth resolution over a 250km swath. Sentinel-1 will support operational services and create a long-term SAR data archive.
This document presents a new method for detecting land cover changes using RADARSAT-2 polarimetric SAR (PolSAR) images. The method integrates change vector analysis, post-classification comparison, and object-oriented image analysis. It was tested on PolSAR images of an area in China that experiences illegal land development. The proposed method achieved higher accuracy than other techniques in determining various types of land cover changes, such as barren land converting to crops or built-up areas. The use of change vector analysis, post-classification comparison, and object-oriented analysis helped reduce false alarms from classification errors and environmental changes.
A polarimeter is an instrument used to measure the angle of rotation caused when polarized light passes through an optically active substance. It consists of a polarimeter tube and operation panel. When light passes through a left-handed or right-handed sample, the translucent semicircular fields in the polarimeter gradually change. There are different types of polarimeters. The specific rotation, a unique property of substances, can be calculated using the measured angle of rotation, concentration, temperature, and length of the sample cell. Polarimeters are used in industries like chemistry, food, beverages, and pharmaceuticals for applications such as quality control and purity measurements.
This document provides an analysis of the handwriting and signatures of three famous individuals: Mukesh Ambani, Anil Ambani, and Ratan Tata. For each person, key personality traits are inferred based on characteristics of their signature such as slant, letter formations, size, and strokes. Mukesh Ambani is described as logical and focused on results while Anil Ambani is portrayed as emotional and defensive. Ratan Tata is highlighted as persistent, determined, and a dreamer. The document suggests that signature analysis can provide insights into one's self-image, social skills, and personality.
The document discusses principles of radar imaging and synthetic aperture radar (SAR). SAR uses signal modulation and range-Doppler processing to achieve high-resolution radar imagery independent of distance to targets. Polarimetric SAR can characterize target scattering properties by measuring the scattering matrix. Interferometric SAR uses two antennas to measure elevation, while differential interferometry detects elevation changes over time for applications like change detection. Emerging techniques include polarimetric interferometry and using polarization signatures to estimate surface tilt and topography.
This document provides information about an upcoming training course on advanced synthetic aperture radar (SAR) processing offered by the Applied Technology Institute (ATI). The 2-day course will be held on May 6-7, 2009 in Chantilly, Virginia and will be instructed by Bart Huxtable. It will cover topics such as the origins of SAR, basic and advanced SAR processing techniques, interferometric SAR, spotlight mode SAR, and polarimetric SAR. The course outline and schedule are provided along with instructor biographies and registration information. Additionally, the document advertises ATI's ability to provide on-site customized training courses.
This document discusses synthetic aperture radar (SAR) and its use in remote sensing applications. SAR uses signal processing to simulate a large physical antenna on an airborne or spaceborne platform. As the platform moves, SAR collects and combines radar return signals to generate high-resolution imagery of the terrain below. Key aspects of SAR discussed include cross-range resolution, sequential generation of the synthetic antenna aperture, and phase correction to focus the SAR image. Applications mentioned include military reconnaissance, oceanography, geology, surveillance, and environmental monitoring.
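The cross-range resolution mentioned above follows the standard synthetic-aperture relation δ = λR / (2L), where λ is wavelength, R slant range, and L the synthesized aperture length. A minimal sketch with illustrative numbers (not taken from the document):

```python
def cross_range_resolution(wavelength_m, slant_range_m, aperture_len_m):
    """Cross-range (azimuth) resolution of a synthetic aperture of
    length L at slant range R: delta = lambda * R / (2 * L)."""
    return wavelength_m * slant_range_m / (2.0 * aperture_len_m)

# X-band (3 cm wavelength) at 10 km slant range, 100 m synthetic aperture
delta = cross_range_resolution(0.03, 10_000.0, 100.0)  # 1.5 m
```

The key point the relation captures: a longer synthesized aperture (more platform motion combined coherently) buys finer azimuth resolution, which a physical antenna of practical size could never achieve at that range.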
Radar 2009 a 18 synthetic aperture radar
This document provides an overview of a lecture on synthetic aperture radar (SAR). It begins with an introduction to SAR, including why it was developed due to limitations of conventional radar for imaging. It then discusses the basics of SAR and how it forms images using signal processing to synthesize a large antenna aperture. The document outlines the rest of the lecture topics which will cover SAR image formation techniques, examples, applications, and a history of the evolution of SAR from its origins in the 1950s to current systems.
This document discusses synthetic aperture radar (SAR) and pulse compression techniques. It explains that pulse compression allows radar systems to achieve fine range resolution using long duration, low power pulses by modulating the pulses with linear frequency modulation (chirp) and then correlating the received signal with a reference chirp. This improves the signal to noise ratio compared to using short pulses directly. The document covers topics such as range resolution, pulse compression, chirp waveforms, stretch processing, correlation processing, window functions, and how pulse compression affects signal to noise ratio and blind range.
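The chirp-plus-matched-filter idea can be sketched in a few lines of NumPy: generate a linear FM pulse, then correlate the echo with a replica of the transmitted waveform to compress it. Sample rate, pulse length, and bandwidth below are illustrative assumptions:

```python
import numpy as np

# Linear FM (chirp) pulse: long duration, bandwidth B, later compressed
# by correlating the received echo with a replica of the transmitted chirp.
fs = 1e6            # sample rate, Hz (illustrative)
T = 1e-3            # pulse length, s
B = 1e5             # chirp bandwidth, Hz
t = np.arange(0, T, 1 / fs)
chirp = np.exp(1j * np.pi * (B / T) * t**2)   # baseband LFM waveform

# Matched filtering (correlation with the reference chirp)
compressed = np.abs(np.correlate(chirp, chirp, mode="full"))
peak = int(np.argmax(compressed))             # peak at zero lag
```

The compressed mainlobe width is roughly 1/B rather than the pulse length T, which is how a long low-power pulse delivers short-pulse range resolution with better signal-to-noise ratio.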
This document discusses polarimetry, which is the study of the rotation of polarized light by optically active substances. Polarimetry can be used to both identify and quantify compounds based on their ability to rotate plane-polarized light clockwise or counterclockwise. The document outlines the principles of polarimetry using optically active compounds and the instrumentation of a polarimeter. Applications of polarimetry include identification of compounds, determination of optical activity, and uses in the chemical, food, beverage, pharmaceutical, and sugar industries for purity testing and concentration measurements.
Polarimetry is the study of the rotation of polarized light by transparent substances. Plane polarized light consists of two components rotating in opposite directions. When an optically active substance is placed in the path of plane polarized light, it rotates the plane of polarization. The magnitude of rotation depends on factors like the nature and concentration of the substance, temperature, and wavelength of light. Polarimeters are used to detect and measure optical activity by determining the angle of rotation of plane polarized light passing through a sample.
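The dependence on concentration and path length is normalized out by the specific rotation, [α] = α / (l·c), with the observed rotation α in degrees, path length l in decimeters, and concentration c in g/mL. A minimal sketch (the reading below is illustrative; it happens to land near sucrose's literature value of about +66.5°):

```python
def specific_rotation(alpha_deg, path_dm, conc_g_per_ml):
    """Specific rotation [alpha] = alpha / (l * c): observed rotation in
    degrees, path length in decimeters, concentration in g/mL."""
    return alpha_deg / (path_dm * conc_g_per_ml)

# Illustrative reading: 13.3 deg through a 1 dm tube of 0.2 g/mL solution
sr = specific_rotation(13.3, 1.0, 0.2)  # 66.5
```

Inverting the same formula is how polarimeters are used for concentration and purity measurements: with [α] known for a pure substance, a measured α yields c directly.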
SEGMENTATION OF POLARIMETRIC SAR DATA WITH A MULTI-TEXTURE PRODUCT MODEL
1) The document describes a segmentation algorithm for polarimetric SAR (PolSAR) data that can model both scalar-texture and multi-texture scattering.
2) The algorithm uses log-cumulants and hypothesis testing to determine whether a scalar-texture or dual-texture model best fits the data within each segment.
3) The algorithm is tested on simulated multi-texture PolSAR data and is shown to accurately segment the classes and estimate their texture parameters. However, when applied to real data sets, the algorithm only finds the simpler scalar-texture case.
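The paper's hypothesis test is not reproduced here, but the sample log-cumulants it is built on are simple to compute: for positive-valued intensity data, κ1 is the mean of log-intensities and κ2 their variance. A minimal sketch (the constant toy data is only for checking the arithmetic):

```python
import math

def log_cumulants(samples):
    """First two sample log-cumulants of positive-valued data:
    k1 = mean(log x), k2 = variance(log x). These feed the
    texture-model hypothesis tests used in PolSAR segmentation."""
    logs = [math.log(s) for s in samples]
    n = len(logs)
    k1 = sum(logs) / n
    k2 = sum((v - k1) ** 2 for v in logs) / n
    return k1, k2

k1, k2 = log_cumulants([math.e] * 4)  # constant data: k1 = 1, k2 = 0
```

Different texture distributions predict different (κ1, κ2) loci, so comparing the sample values against model predictions is what lets the algorithm choose between the scalar-texture and dual-texture hypotheses per segment.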
The document summarizes the status of the GMES Space Component program. It describes the Sentinel satellite missions for monitoring land, ocean, atmosphere and emergency situations. The Sentinels will provide long-term data continuity as well as improved coverage compared to existing missions. Sentinel data will be freely and openly available to both operational users and the science community. The program is on track, with the first Sentinel launches beginning in 2013.
PROGRESSES OF DEVELOPMENT OF CFOSAT SCATTEROMETERgrssieee
The document describes the progress of the development of CFOSAT SCAT, a Ku-band scatterometer onboard the Chinese-French Oceanography Satellite (CFOSAT). CFOSAT will measure global ocean surface winds and waves to improve weather forecasting, ocean dynamics modeling, climate research, and understanding of surface processes. The SCAT instrument is a rotating fan-beam radar scatterometer that will retrieve wind vectors using measurements of backscatter at incidence angles from 26 to 46 degrees. It has a wide swath of over 1000km and specifications are designed to achieve high-precision wind measurements globally. System details including parameters and the operation mode are provided.
DEVELOPMENT OF ALGORITHMS AND PRODUCTS FOR SUPPORTING THE ITALIAN HYPERSPECTR...grssieee
The document describes the SAP4PRISMA project which aims to develop algorithms and products to support the Italian hyperspectral PRISMA Earth observation mission. The project will focus on data processing, quality assessment, classification methods, and generating level 3 and 4 products for applications like land monitoring, agriculture, and hazard monitoring. It will include the generation of "PRISMA-like" synthetic test data to support algorithm development and validation. The research will be carried out across multiple work packages focusing on topics like data quality, classification methods, calibration/validation, and developing applicative products.
EO-1/HYPERION: NEARING TWELVE YEARS OF SUCCESSFUL MISSION SCIENCE OPERATION A...grssieee
1) The EO-1 Hyperion instrument has collected over 65,000 scenes over its 12-year mission to study land and coastal ecosystems using imaging spectroscopy.
2) Studies using Hyperion data have identified spectral indices related to chlorophyll that correlate with carbon flux measurements at different sites, including a Zambian woodland and North Carolina forest sites.
3) Time series of Hyperion data at flux tower sites show seasonal changes in these spectral indices that match patterns in ecosystem carbon uptake and release.
EO-1/HYPERION: NEARING TWELVE YEARS OF SUCCESSFUL MISSION SCIENCE OPERATION A...grssieee
EO-1/Hyperion has been collecting hyperspectral imagery for over 12 years, acquiring over 65,000 scenes. Researchers have been using these data to develop and validate algorithms for estimating vegetation properties like fraction of absorbed photosynthetically active radiation (fAPAR) and photochemical reflectance index (PRI). Comparisons of Hyperion data to field measurements at flux tower sites show these algorithms can accurately track vegetation changes over time and relate spectral properties to productivity metrics like light use efficiency and gross ecosystem productivity. This work is helping prototype data products for the upcoming HyspIRI mission.
This document is a return and exchange form for a wetsuit company. It provides instructions for customers to fill out when returning an undamaged item for a refund, exchange, or size change. The form requests information like the customer's order details, contact information, the suit being returned and its size, the reason for return, and if applicable, the new desired size. It also provides the return shipping address and notifies customers that the company is not responsible for lost or damaged return packages.
This document provides instructions for clients of Fox Tax Planning and Preparation for preparing to have their taxes filed. It lists important income and deduction documentation to bring to an appointment, such as W-2s, 1099s, receipts for donations. It also includes an engagement letter detailing the services to be provided, responsibilities of both parties, fees, and electronic filing and signature procedures. Clients are asked to sign the letter agreeing to the terms and return it along with their tax information.
The document discusses mapping wetlands in North America using MODIS 500m imagery. It describes wetlands and existing global wetland databases. The methodology uses MODIS data from 2008, digital elevation models, and reference data to classify wetlands into three types - forest/shrub dominant wetlands, herbaceous dominant wetlands, and sea grass dominant wetlands. Training data is collected from existing land cover maps and Landsat imagery. A decision tree model and maximum likelihood classification are applied to extract wetlands from other land covers.
The document summarizes research using SBAS-DInSAR (Small BAseline Subset differential interferometric synthetic aperture radar) techniques to analyze ground deformation at Mt. Etna volcano in Italy over the last 18 years using ERS and ENVISAT satellite data. The analysis revealed three main deformation processes: inflation of the volcanic edifice, subsidence of sectors on the eastern flank due to gravitational spreading, and deflation-inflation cycles associated with eruptive and post-eruptive activity. More recent analysis using higher resolution COSMO-SkyMed data from 2009-2010 detected deformation related to faults and a 2010 earthquake more precisely than lower resolution ENVISAT data.
This study analyzed crustal deformation in the seismically active Patras Gulf region of Greece using GPS data. The GPS network was established in 1994 and expanded through additional campaigns in 1996, 2006, and 2011. The data show the Patras Gulf is opening up at a rate of 8-13 mm/yr with uplift of 5 mm/yr in the south. A continuous GPS station revealed southeast horizontal motion of 17.4 mm/yr and a clear signal from the 2008 Mw 6.4 Andravida earthquake. The results indicate increasing extension across the gulf and smaller extension near the Rio-Antirrio bridge area.
The document discusses advances in SAR interferometry over the past 20 years for measuring millimeter-scale land motion. Key points include:
1) Revisit times have decreased from 35 days with ERS-1 to 12 days with Sentinel-1 constellations, improving ground motion recovery.
2) Persistent scatterer interferometry techniques like SqueeSAR can now measure motions to the millimeter by using all available interferograms.
3) Atmospheric phase screens still limit accuracy but can be estimated and removed using numerical weather models, GPS, and other independent datasets.
4) Future opportunities include using wide Doppler bandwidths from satellites to achieve high-resolution azimuth measurements of ground motion.
This document summarizes a study of ground deformation on Cephalonia Island in western Greece using GPS and satellite interferometry data. The study analyzed data from 24 GPS benchmarks installed on the island from 2001-2010, as well as radar images from 1992-2000 and 2003-2008. The results show velocity fields and acceleration patterns, with some areas experiencing subsidence of up to 12 mm/yr. Time series analysis identified critically deforming areas that correlate with observed seismic energy release. Estimates of critical time based on accelerating strain and seismicity patterns suggest increased risk of future seismic events in these areas.
Ivanti’s Patch Tuesday breakdown goes beyond patching your applications and brings you the intelligence and guidance needed to prioritize where to focus your attention first. Catch early analysis on our Ivanti blog, then join industry expert Chris Goettl for the Patch Tuesday Webinar Event. There we’ll do a deep dive into each of the bulletins and give guidance on the risks associated with the newly-identified vulnerabilities.
In the realm of cybersecurity, offensive security practices act as a critical shield. By simulating real-world attacks in a controlled environment, these techniques expose vulnerabilities before malicious actors can exploit them. This proactive approach allows manufacturers to identify and fix weaknesses, significantly enhancing system security.
This presentation delves into the development of a system designed to mimic Galileo's Open Service signal using software-defined radio (SDR) technology. We'll begin with a foundational overview of both Global Navigation Satellite Systems (GNSS) and the intricacies of digital signal processing.
The presentation culminates in a live demonstration. We'll showcase the manipulation of Galileo's Open Service pilot signal, simulating an attack on various software and hardware systems. This practical demonstration serves to highlight the potential consequences of unaddressed vulnerabilities, emphasizing the importance of offensive security practices in safeguarding critical infrastructure.
"Choosing proper type of scaling", Olena SyrotaFwdays
Imagine an IoT processing system that is already quite mature and production-ready and for which client coverage is growing and scaling and performance aspects are life and death questions. The system has Redis, MongoDB, and stream processing based on ksqldb. In this talk, firstly, we will analyze scaling approaches and then select the proper ones for our system.
Main news related to the CCS TSI 2023 (2023/1695)Jakub Marek
An English 🇬🇧 translation of the presentation accompanying the speech I gave about the main changes introduced by CCS TSI 2023 at the biggest Czech conference on communications and signalling systems on railways, held at the Clarion Hotel Olomouc from 7th to 9th November 2023 (konferenceszt.cz). It was attended by around 500 participants and 200 online followers.
The original Czech 🇨🇿 version of the presentation can be found here: https://www.slideshare.net/slideshow/hlavni-novinky-souvisejici-s-ccs-tsi-2023-2023-1695/269688092 .
The videorecording (in Czech) from the presentation is available here: https://youtu.be/WzjJWm4IyPk?si=SImb06tuXGb30BEH .
Freshworks Rethinks NoSQL for Rapid Scaling & Cost-EfficiencyScyllaDB
Freshworks creates AI-boosted business software that helps employees work more efficiently and effectively. Managing data across multiple RDBMS and NoSQL databases was already a challenge at their current scale. To prepare for 10X growth, they knew it was time to rethink their database strategy. Learn how they architected a solution that would simplify scaling while keeping costs under control.
Fueling AI with Great Data with Airbyte WebinarZilliz
This talk will focus on how to collect data from a variety of sources, leveraging this data for RAG and other GenAI use cases, and finally charting your course to productionalization.
[OReilly Superstream] Occupy the Space: A grassroots guide to engineering (an...Jason Yip
The typical problem in product engineering is not bad strategy, so much as “no strategy”. This leads to confusion, lack of motivation, and incoherent action. The next time you look for a strategy and find an empty space, instead of waiting for it to be filled, I will show you how to fill it in yourself. If you’re wrong, it forces a correction. If you’re right, it helps create focus. I’ll share how I’ve approached this in the past, both what works and lessons for what didn’t work so well.
For the full video of this presentation, please visit: https://www.edge-ai-vision.com/2024/06/how-axelera-ai-uses-digital-compute-in-memory-to-deliver-fast-and-energy-efficient-computer-vision-a-presentation-from-axelera-ai/
Bram Verhoef, Head of Machine Learning at Axelera AI, presents the “How Axelera AI Uses Digital Compute-in-memory to Deliver Fast and Energy-efficient Computer Vision” tutorial at the May 2024 Embedded Vision Summit.
As artificial intelligence inference transitions from cloud environments to edge locations, computer vision applications achieve heightened responsiveness, reliability and privacy. This migration, however, introduces the challenge of operating within the stringent confines of resource constraints typical at the edge, including small form factors, low energy budgets and diminished memory and computational capacities. Axelera AI addresses these challenges through an innovative approach of performing digital computations within memory itself. This technique facilitates the realization of high-performance, energy-efficient and cost-effective computer vision capabilities at the thin and thick edge, extending the frontier of what is achievable with current technologies.
In this presentation, Verhoef unveils his company’s pioneering chip technology and demonstrates its capacity to deliver exceptional frames-per-second performance across a range of standard computer vision networks typical of applications in security, surveillance and the industrial sector. This shows that advanced computer vision can be accessible and efficient, even at the very edge of our technological ecosystem.
Digital Banking in the Cloud: How Citizens Bank Unlocked Their MainframePrecisely
Inconsistent user experience and siloed data, high costs, and changing customer expectations – Citizens Bank was experiencing these challenges while it was attempting to deliver a superior digital banking experience for its clients. Its core banking applications run on the mainframe and Citizens was using legacy utilities to get the critical mainframe data to feed customer-facing channels, like call centers, web, and mobile. Ultimately, this led to higher operating costs (MIPS), delayed response times, and longer time to market.
Ever-changing customer expectations demand more modern digital experiences, and the bank needed to find a solution that could provide real-time data to its customer channels with low latency and operating costs. Join this session to learn how Citizens is leveraging Precisely to replicate mainframe data to its customer channels and deliver on their “modern digital bank” experiences.
Essentials of Automations: Exploring Attributes & Automation ParametersSafe Software
Building automations in FME Flow can save time, money, and help businesses scale by eliminating data silos and providing data to stakeholders in real-time. One essential component to orchestrating complex automations is the use of attributes & automation parameters (both formerly known as “keys”). In fact, it’s unlikely you’ll ever build an Automation without using these components, but what exactly are they?
Attributes & automation parameters enable the automation author to pass data values from one automation component to the next. During this webinar, our FME Flow Specialists will cover leveraging the three types of these output attributes & parameters in FME Flow: Event, Custom, and Automation. As a bonus, they’ll also be making use of the Split-Merge Block functionality.
You’ll leave this webinar with a better understanding of how to maximize the potential of automations by making use of attributes & automation parameters, with the ultimate goal of setting your enterprise integration workflows up on autopilot.
"Frontline Battles with DDoS: Best practices and Lessons Learned", Igor IvaniukFwdays
In this talk we will discuss DDoS protection tools and best practices, network architectures, and what AWS has to offer. We will also look into one of the largest DDoS attacks on Ukrainian infrastructure, which happened in February 2022, and see what techniques helped keep web resources available for Ukrainians and how AWS improved DDoS protection for all customers based on the Ukraine experience.
5th LF Energy Power Grid Model Meet-up SlidesDanBrown980551
5th Power Grid Model Meet-up
It is with great pleasure that we extend to you an invitation to the 5th Power Grid Model Meet-up, scheduled for 6th June 2024. This event will adopt a hybrid format, allowing participants to join us either through an online Microsoft Teams session or in person at TU/e, located at Den Dolech 2, Eindhoven, Netherlands. The meet-up will be hosted by Eindhoven University of Technology (TU/e), a research university specializing in engineering science & technology.
Power Grid Model
The global energy transition is placing new and unprecedented demands on Distribution System Operators (DSOs). Alongside upgrades to grid capacity, processes such as digitization, capacity optimization, and congestion management are becoming vital for delivering reliable services.
Power Grid Model is an open source project from Linux Foundation Energy and provides a calculation engine that is increasingly essential for DSOs. It offers a standards-based foundation enabling real-time power systems analysis, simulations of electrical power grids, and sophisticated what-if analysis. In addition, it enables in-depth studies and analysis of the electrical power grid’s behavior and performance. This comprehensive model incorporates essential factors such as power generation capacity, electrical losses, voltage levels, power flows, and system stability.
Power Grid Model is currently being applied in a wide variety of use cases, including grid planning, expansion, reliability, and congestion studies. It can also help in analyzing the impact of renewable energy integration, assessing the effects of disturbances or faults, and developing strategies for grid control and optimization.
What to expect
For the upcoming meetup we are organizing, we have an exciting lineup of activities planned:
-Insightful presentations covering two practical applications of the Power Grid Model.
-An update on the latest advancements in Power Grid Model technology during the first and second quarters of 2024.
-An interactive brainstorming session to discuss and propose new feature requests.
-An opportunity to connect with fellow Power Grid Model enthusiasts and users.
The Microsoft 365 Migration Tutorial For Beginner.pptxoperationspcvita
This presentation will help you understand the power of Microsoft 365. It covers every productivity app included in Office 365, outlines common Office 365 migration scenarios, and explains how we can help you.
You can also read: https://www.systoolsgroup.com/updates/office-365-tenant-to-tenant-migration-step-by-step-complete-guide/
zkStudyClub - LatticeFold: A Lattice-based Folding Scheme and its Application...Alex Pruden
Folding is a recent technique for building efficient recursive SNARKs. Several elegant folding protocols have been proposed, such as Nova, Supernova, Hypernova, Protostar, and others. However, all of them rely on an additively homomorphic commitment scheme based on discrete log, and are therefore not post-quantum secure. In this work we present LatticeFold, the first lattice-based folding protocol, based on the Module SIS problem. This folding protocol naturally leads to an efficient recursive lattice-based SNARK and an efficient PCD scheme. LatticeFold supports folding low-degree relations, such as R1CS, as well as high-degree relations, such as CCS. The key challenge is to construct a secure folding protocol that works with the Ajtai commitment scheme. The difficulty is ensuring that extracted witnesses are low norm through many rounds of folding. We present a novel technique using the sumcheck protocol to ensure that extracted witnesses are always low norm no matter how many rounds of folding are used. Our evaluation of the final proof system suggests that it is as performant as Hypernova, while providing post-quantum security.
Paper Link: https://eprint.iacr.org/2024/257
A STABLE MODEL-BASED THREE-COMPONENT DECOMPOSITION APPROACH FOR POLARIMETRIC SAR DATA
1. A STABLE MODEL-BASED THREE-COMPONENT DECOMPOSITION APPROACH FOR POLARIMETRIC SAR DATA. Zhihao Jiao, Jiong Chen, Jian Yang, Tsinghua University
5. Stable decomposition. To measure the stability of a decomposition, use the noise sensitivity factor a: the decomposition is stable if A = min{a} exists and A is small (e.g., A < 5). For the standard Freeman decomposition, a is unbounded.
First, I will introduce the polarimetric decomposition approaches developed in recent decades, including the Freeman decomposition, which is a very effective incoherent decomposition. Second, the stability of a decomposition will be discussed, so that we can determine whether a decomposition approach is stable or unstable. Third, the negative powers in the Freeman decomposition will be described; they are not consistent with the actual scattering mechanism. Fourth, an improved three-component decomposition approach will be proposed, together with some experimental results. The last part is the conclusion and outlook.
Target decomposition methods are useful tools for the interpretation of PolSAR images: they decompose polarimetric SAR data into several parts that represent different scattering mechanisms. These methods fall into two classes, coherent decompositions and incoherent decompositions. Coherent decomposition methods decompose the Sinclair scattering matrix S; they include the Pauli, Krogager, Cameron, and SSCM decompositions. Incoherent decomposition methods decompose the coherency matrix, covariance matrix, Mueller matrix, or Stokes matrix, including ….
The Freeman decomposition decomposes a coherency matrix into surface-scattering, dihedral-scattering, and volume-scattering parts. The elements T13, T23, T31, and T32 are ignored. Alpha and beta are the model parameters of dihedral and surface scattering respectively, with absolute values less than 1, and fs, fd, and fv represent the contributions of surface, dihedral, and volume scattering.
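As a concrete sketch of this model, the three component coherency matrices can be written in numpy. This uses the commonly seen coherency-basis form of the three-component model; the function name and the example parameter values are illustrative, not taken from the slides:

```python
import numpy as np

def freeman_components(fs, fd, fv, alpha, beta):
    """Three-component model coherency matrices (coherency basis).

    fs, fd, fv  : contributions of surface, dihedral, volume scattering
    alpha, beta : dihedral / surface model parameters, |alpha| < 1, |beta| < 1
    """
    T_surface = fs * np.array([[1.0, np.conj(beta), 0.0],
                               [beta, abs(beta) ** 2, 0.0],
                               [0.0, 0.0, 0.0]], dtype=complex)
    T_dihedral = fd * np.array([[abs(alpha) ** 2, alpha, 0.0],
                                [np.conj(alpha), 1.0, 0.0],
                                [0.0, 0.0, 0.0]], dtype=complex)
    T_volume = (fv / 4.0) * np.diag([2.0, 1.0, 1.0]).astype(complex)
    return T_surface, T_dihedral, T_volume

# The modelled coherency matrix is the sum of the three parts.
Ts, Td, Tv = freeman_components(fs=1.0, fd=0.5, fv=0.8, alpha=0.3, beta=0.2)
T_model = Ts + Td + Tv
```

Note that T13 and T23 of the modelled matrix are zero by construction, matching the elements the decomposition ignores.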
In Jian Yang's Ph.D. dissertation, a stable decomposition is defined as one whose result is not sensitive to noise. After three-component decomposition, a coherency matrix is decomposed into three parts, T1, T2, and T3. If we add a noise matrix ΔT to the initial coherency matrix, the new coherency matrix is likewise decomposed into T1', T2', and T3'. The noise sensitivity factor can then be defined as a = (‖T1' − T1‖_F + ‖T2' − T2‖_F + ‖T3' − T3‖_F) / ‖ΔT‖_F, i.e., the sum of the errors of the three parts, measured in the Frobenius norm, over the Frobenius norm of ΔT. If the minimum A of the noise sensitivity factor exists and A is small, e.g., smaller than 5, the decomposition can be called stable. Unfortunately, for the standard Freeman decomposition the noise sensitivity factor is unbounded, so the Freeman decomposition is not stable.
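The noise sensitivity factor just defined can be sketched numerically. The `decompose` callable stands in for any three-component decomposition; the toy "split" used to exercise it below is purely illustrative:

```python
import numpy as np

def noise_sensitivity(decompose, T, delta_T):
    """Noise sensitivity factor:
    a = (sum_i ||T_i' - T_i||_F) / ||delta_T||_F,
    where decompose() maps a coherency matrix to its component matrices."""
    parts = decompose(T)
    parts_noisy = decompose(T + delta_T)
    err = sum(np.linalg.norm(Tp - Ti, 'fro')
              for Ti, Tp in zip(parts, parts_noisy))
    return err / np.linalg.norm(delta_T, 'fro')

# Toy linear "decomposition" (diagonal part / off-diagonal part / zero part),
# used only to exercise the function; a Freeman-style decompose() would be
# plugged in instead.
split = lambda M: (np.diag(np.diag(M)), M - np.diag(np.diag(M)), np.zeros_like(M))
T = np.array([[2.0, 0.5, 0.0], [0.5, 1.0, 0.0], [0.0, 0.0, 0.5]])
a = noise_sensitivity(split, T, 0.01 * np.eye(3))  # equals 1 for this linear split
```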
Here is an example. We add a small noise matrix ΔT to the initial coherency matrix T. We can see that the variation of the decomposition results is much larger than the variation of the observed scattering matrix. So, obviously, the standard Freeman decomposition is not a stable decomposition. This instability can cause drastic changes in the decomposition results, especially in landform areas where the power of surface scattering is nearly as strong as that of dihedral scattering.
This is a comparison between the Pauli decomposition and the Freeman decomposition. The AIRSAR L-band multilook polarimetric data of San Francisco is used, and the image size is 1024×900. In the ocean-beach area, the powers of surface scattering and dihedral scattering are assumed to be close to each other. Because the waves become bigger as the distance to the shore decreases, the power of dihedral scattering increases relative to surface scattering. So in the Pauli decomposition image, the red color component increases smoothly from the ocean to the beach, but the transition is more dramatic in the Freeman decomposition image. In the ocean, the power of surface scattering is overestimated and the power of dihedral scattering is underestimated; in the beach area, the situation is the opposite. We can see a clear boundary in the continuously descending beach area, caused by the instability of the Freeman decomposition.
Let us look at the calculation of the Freeman decomposition to find the reason for its instability. Here are the equations to solve: four equations with five variables. After simplification, we get three equations with four variables. To solve this underdetermined problem, the Freeman decomposition, as shown in the process mapping, fixes one model parameter of surface scattering or dihedral scattering to a certain value. So the calculation of the Freeman decomposition is an ill-posed problem, which leads to instability of the decomposition result in some cases. An ill-posed problem is a problem without a unique solution, or whose unique solution is not stable. Although the calculation of the Freeman decomposition is determined, the process of comparing T11 − 2T33 and T22 − T33 makes the solution unstable.
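Written out under the coherency-basis convention above, the system being described is (a reconstruction from the surrounding definitions, not a verbatim copy of the slide):

```latex
\begin{aligned}
T_{11} &= f_s + f_d\,|\alpha|^2 + \tfrac{1}{2} f_v \\
T_{22} &= f_s\,|\beta|^2 + f_d + \tfrac{1}{4} f_v \\
T_{33} &= \tfrac{1}{4} f_v \\
T_{12} &= f_s\,\beta^{*} + f_d\,\alpha
\end{aligned}
```

Eliminating $f_v = 4T_{33}$ leaves three equations, in $T_{11} - 2T_{33}$, $T_{22} - T_{33}$, and $T_{12}$, for the four unknowns $(f_s, f_d, \alpha, \beta)$, which is the underdetermined system the slide refers to.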
The emergence of negative powers is another problem of the Freeman decomposition. There are three reasons, listed as follows, that can cause the powers of surface scattering or dihedral scattering to be negative. However, after decomposition the powers of the sub-scatterings are supposed to be nonnegative, so the emergence of negative powers is not consistent with the actual scattering mechanism.
A regularization method named Tikhonov regularization is commonly used to solve ill-posed problems, for its theoretical integrity and simplicity of implementation. So we employ it to stabilize the decomposition by adding a regularization term to the initially ill-posed problem; at the same time, the negative powers can also be eliminated. A regularization method uses a series of well-posed problems with a parameter to approximate the ill-posed problem; with an appropriate parameter, the ill-posed problem can be solved well. Tikhonov regularization minimizes a smoothing function of the form J(x) = ‖y − F(x)‖² + λΩ(x), where y − F(x) represents the ill-posed problem and Ω(x) is a nonnegative function that measures the "size" of the regularized solution. λ is the regularization parameter; it is user-selected and can be regarded as a tuning value for the influence of the regularization term.
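The stabilizing effect can be illustrated on a generic ill-conditioned linear system, unrelated to the decomposition itself; the matrix and noise below are made up for the demonstration:

```python
import numpy as np

# Ill-conditioned linear system y = F x + noise
F = np.array([[1.0, 1.0],
              [1.0, 1.0001]])          # nearly rank-deficient
x_true = np.array([1.0, 1.0])
noise = np.array([1e-4, -1e-4])        # small measurement noise
y = F @ x_true + noise

def tikhonov(F, y, lam):
    """Minimise ||y - F x||^2 + lam * ||x||^2 via the normal equations."""
    n = F.shape[1]
    return np.linalg.solve(F.T @ F + lam * np.eye(n), F.T @ y)

x_ls = tikhonov(F, y, 0.0)     # unregularised: the tiny noise is amplified
x_reg = tikhonov(F, y, 1e-3)   # regularised: stays close to x_true
```

The unregularised solution is thrown far from x_true by noise of size 1e-4, while the regularised one is not: this is exactly the stabilization the proposed approach borrows.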
We usually use prior information to decide Ω(x). Here we choose Ω(x) = |α|² + |β|², since with no prior information about the ground truth it is natural to select small moduli of α and β as the possible solution. This regularization term also coincides with the standard Freeman decomposition, in which α or β is fixed to 0 when the corresponding scattering power is secondary. So the objective function is J = ‖ΔT‖_F² + λ(|α|² + |β|²), where ΔT is the difference between the measured coherency matrix and the coherency matrix calculated from the candidate solution.
Then we discuss the selection of the regularization parameter. The most effective value is determined by the intensity of the noise, which is usually not easy to estimate accurately. The regularization parameter could be fixed to a constant for an image, but it is better to choose a more effective value for every pixel. In our proposed approach, the L-curve method is employed. It is an a posteriori rule that finds a nearly optimal λ by calculating the curvature of the L-curve, the curve drawn with the residual norm ‖ΔT(x₀)‖ as the abscissa and the regularization term Ω(x₀) as the ordinate (usually on logarithmic scales). Here x₀ is the optimal solution for a candidate regularization parameter. At the position corresponding to the best λ, the L-curve has the largest curvature.
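A minimal sketch of L-curve-based parameter selection for a generic Tikhonov problem follows; the test matrix, the λ grid, and the function name are illustrative:

```python
import numpy as np

def l_curve_lambda(F, y, lambdas):
    """Pick lambda at the corner (maximum curvature) of the L-curve.

    L-curve: log residual norm ||y - F x|| (abscissa) versus
             log solution norm ||x|| (ordinate), over candidate lambdas."""
    n = F.shape[1]
    rho, eta = [], []
    for lam in lambdas:
        x = np.linalg.solve(F.T @ F + lam * np.eye(n), F.T @ y)
        rho.append(np.log(np.linalg.norm(y - F @ x)))
        eta.append(np.log(np.linalg.norm(x)))
    rho, eta = np.array(rho), np.array(eta)
    # Curvature of the parametric curve (rho(t), eta(t)), t = log(lambda),
    # by finite differences.
    t = np.log(lambdas)
    dr, de = np.gradient(rho, t), np.gradient(eta, t)
    ddr, dde = np.gradient(dr, t), np.gradient(de, t)
    kappa = (dr * dde - de * ddr) / (dr ** 2 + de ** 2) ** 1.5
    return lambdas[np.argmax(kappa)]

# Illustrative usage on a small ill-conditioned system
F = np.array([[1.0, 1.0], [1.0, 1.0001]])
y = np.array([2.0001, 2.0000])
lams = np.logspace(-8, 0, 30)
lam_best = l_curve_lambda(F, y, lams)
```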
Here we show the decomposition with fixed regularization parameters. The lower two figures are the proposed decomposition results with different fixed regularization parameters, λ = 0.1 and λ = 6, respectively; the upper two figures are the results of the Pauli decomposition and the Freeman decomposition, for comparison. Compared with the result of the Freeman decomposition, the powers of dihedral scattering in the urban area shown in the lower figures are enhanced, which helps us distinguish urban areas from forest areas. However, the dihedral-scattering powers in the ocean area are overestimated at the same time, especially when λ = 6. So it is necessary to employ a parameter-selection method that chooses an effective λ for every pixel automatically.
Here is the procedure of the proposed approach. The deorientation process is applied to the coherency matrix before all decompositions to reduce the overestimation of the volume-scattering power. Then we compare the absolute values of T11 − 2T33 and T22 − T33. When there is a significant difference between them, for example when one is 5 times bigger than the other, we use the original Freeman decomposition for its low computational complexity; otherwise the proposed approach is used. To solve the optimization problem, we can use a quasi-Newton method. By introducing constraints such as fs ≥ 0, fd ≥ 0, and fv ≥ 0, all negative powers are eliminated.
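A toy numpy sketch of the core fitting step follows, assuming real-valued α and β and replacing the quasi-Newton solver with a coarse grid search for simplicity; all names and values are illustrative, and this is not the authors' implementation:

```python
import numpy as np

def regularized_decomposition(T, lam, grid=np.linspace(-0.99, 0.99, 41)):
    """Toy regularised three-component fit on a real coherency matrix.

    Minimises ||T - T_model||^2 + lam * (alpha^2 + beta^2) over a grid of
    real alpha, beta, with fs, fd clipped to be non-negative and fv = 4*T33."""
    fv = 4.0 * T[2, 2]                                   # volume power from T33
    t11, t22, t12 = T[0, 0] - fv / 2, T[1, 1] - fv / 4, T[0, 1]
    best = None
    for alpha in grid:
        for beta in grid:
            # Least-squares fs, fd for fixed alpha, beta (clipped to >= 0)
            A = np.array([[1.0, alpha ** 2],
                          [beta ** 2, 1.0],
                          [beta, alpha]])
            b = np.array([t11, t22, t12])
            fs, fd = np.maximum(np.linalg.lstsq(A, b, rcond=None)[0], 0.0)
            resid = np.linalg.norm(A @ np.array([fs, fd]) - b) ** 2
            cost = resid + lam * (alpha ** 2 + beta ** 2)
            if best is None or cost < best[0]:
                best = (cost, fs, fd, fv, alpha, beta)
    return best[1:]

# Synthetic coherency matrix built from fs=1, fd=0.5, fv=0.8, alpha=0.3, beta=0.2
T = np.array([[1.445, 0.35, 0.0],
              [0.35, 0.74, 0.0],
              [0.0, 0.0, 0.2]])
fs_h, fd_h, fv_h, a_h, b_h = regularized_decomposition(T, lam=1e-3)
```

Because the underlying system is underdetermined, the regularizer selects the small-modulus (α, β) point among the matrices that fit T; the recovered powers stay non-negative by construction, which is the behavior the procedure above aims for.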
Here are the experimental results. The left figure is the result of the Freeman decomposition and the right figure is the result of the proposed approach: red for dihedral scattering, blue for surface scattering, green for volume scattering. It can be seen that, using the improved decomposition approach, the powers of dihedral scattering are increased in the urban area and the behavior of the backscatter in the ocean and forest areas is described well. Therefore the detection performance for man-made targets in the forest area could be improved by using the proposed approach. Moreover, in the boxed ocean-beach area, the color from the beach to the ocean changes smoothly in the right figure, while the transition is more dramatic in the left figure. This indicates that the proposed decomposition approach is more stable. By constraining the sub-scattering powers to positive values, no pixel with negative powers exists in the result of the proposed approach, while more than one fifth of the pixels have negative powers with the original Freeman decomposition.
Let us conclude. To solve the instability issue of the Freeman decomposition and eliminate the negative powers, an improved model-based three-component decomposition approach is developed. By minimizing a continuous objective function consisting of the error of the coherency matrix and the regularization term, the decomposition result is more stable. The negative powers are also eliminated by introducing constraints on the solution domain of the decomposition. Because we ignore neither α nor β when the powers of surface scattering and dihedral scattering are close, surface scattering and dihedral scattering are treated equally, which implies the decomposition is also more reliable. However, the proposed approach has higher computational complexity because of the optimization. Moreover, the regularization-based decomposition leaves room for improvement; for example, in areas with different scattering components we could fix different regularization parameters for α and β, and a regularization term such as Ω = …. may also be effective.