The document presents an approach to detecting land cover change using temporal autocorrelation analysis of hyper-temporal satellite imagery. The methodology develops an autocorrelation-based change metric to distinguish real changes from natural phenological cycles. The optimal autocorrelation lag and detection threshold are determined using simulated change datasets. When applied to MODIS imagery of settlements in Gauteng, South Africa, the approach detected over 92% of real changes with an overall accuracy of 88%.
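The core of the metric lends itself to a short illustration: compute a pixel's autocorrelation at a seasonal lag and flag change when it drops. A minimal sketch, where the lag of 23 composites per year, the 0.5 threshold, and all function names are placeholder assumptions (the paper tunes lag and threshold on simulated change datasets):

```python
import numpy as np

def lag_autocorrelation(series, lag):
    """Pearson autocorrelation of a 1-D series with a lagged copy of itself."""
    x = series[:-lag] - series[:-lag].mean()
    y = series[lag:] - series[lag:].mean()
    return float((x * y).sum() / np.sqrt((x ** 2).sum() * (y ** 2).sum()))

def flag_change(ndvi_series, lag=23, threshold=0.5):
    """Flag a pixel as changed when its seasonal-lag autocorrelation falls
    below the detection threshold. The lag (one year of 16-day composites)
    and the threshold are placeholders, not the paper's tuned values."""
    return lag_autocorrelation(ndvi_series, lag) < threshold

# Two years of a stable annual NDVI cycle (23 composites per year)...
t = np.arange(46)
stable = np.sin(2 * np.pi * t / 23)
# ...versus the same pixel with a shifted seasonality after year one
changed = stable.copy()
changed[23:] = np.cos(2 * np.pi * t[23:] / 23)
```

A stable phenological cycle repeats itself at the seasonal lag and scores near 1; a genuine land cover change breaks the repetition and pushes the score below the threshold.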
The document summarizes a study that evaluates the uncertainties in global moderate resolution Leaf Area Index (LAI) products derived from satellite data, including MODIS and CYCLOPES. The study uses a global database of 219 field LAI measurements from 129 sites to directly validate the satellite products. Results show that while MODIS LAI estimates have improved across product versions, current LAI products still have uncertainties of around ±1.0, which does not meet the ±0.5 accuracy requirement set by GCOS. Future work is needed to reduce uncertainties, especially for certain biomes and conditions.
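The accuracy statement can be made concrete: direct validation here amounts to computing an error statistic between satellite retrievals and field measurements and checking it against the GCOS target. A hypothetical sketch — RMSE as the uncertainty measure and the sample values are illustrative assumptions, not the study's data:

```python
import numpy as np

def lai_rmse(field_lai, satellite_lai):
    """Root-mean-square difference between field and satellite LAI."""
    d = np.asarray(satellite_lai) - np.asarray(field_lai)
    return float(np.sqrt(np.mean(d ** 2)))

def meets_gcos(rmse, requirement=0.5):
    """GCOS accuracy requirement for LAI is +/-0.5 (as cited in the study)."""
    return rmse <= requirement

# Illustrative numbers only: errors around +/-1.0, as the study reports
field = [1.2, 3.4, 2.0, 4.5]
modis = [2.1, 4.6, 1.1, 5.4]
```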
This document summarizes research assessing the use of X-ray fluorescence (XRF) technique to non-invasively measure the percentage of silver (%Ag) in tin-silver (SnAg) solders used in flip chip interconnects. Key findings include:
1) XRF measurements of %Ag in SnAg solders were found to be affected by bump geometry, composition, underlying chip wiring and tool parameters. Accuracy improved by using known calibration standards of the same dimensions as the bumps measured.
2) Flattening bump tops through stamping improved measurement stability and accuracy for bumps 70μm high or less by reducing X-ray scattering effects.
3) Background noise from underlying chip wiring
Indufor: forest intelligence, remote sensing and GIS, BAU with a bit mo... (Nelson Gapare)
Remote sensing and GIS tools are now essential for forest resource assessment and management due to decreased data costs and improved resolution. These tools can help address challenges of monitoring isolated or non-compliant activities cost-effectively. Indufor provides remote sensing services including resource monitoring, compliance assessment, and mapping for projects involving REDD+, carbon, and certification. Services typically cost $0.30-0.40 per hectare and provide automated detection of changes like harvesting or encroachment.
The document discusses sustainable forest management practices in the Kampar Ring region of Indonesia. It outlines plans for a plantation ring consisting of production forests, conservation areas, and livelihood zones. It also discusses community programs, certification initiatives, efforts to prevent illegal logging and fires, and research projects aimed at understanding peatland hydrology and carbon dynamics to minimize emissions. Sustainable management of peatlands is presented as key to protecting biodiversity, providing ecosystem services, and supporting local communities and economies in the long run.
Change detection analysis in land use / land cover of Pune city using remotel... (Nitin Mundhe)
Lecture delivered at the National Conference "Monitoring Degraded Lands", jointly organized by Agasti Arts, Commerce and Dadasaheb Rupwate Science College, Akole, and Maharashtra Bhugolshastra Parishad, Pune, held on 4 to 6 February 2014.
This document provides an overview of plantation agriculture and oil palm plantations in Malaysia. It describes the key characteristics of plantation agriculture, including large land sizes and commercial single crop production. The document then discusses the historical development and current ownership structures of plantations. It also outlines the distribution, importance and challenges of oil palm plantations in Malaysia, the largest global producer, including their processes from cultivation to processing.
This document describes a student project implementing speech recognition for desktop applications. It was completed by three students - Sarang Afle, Sneh Joshi, and Surbhi Sharma - for their computer science degree under the supervision of Professor Nitesh Rastogi. The project involved developing a speech recognition software that allows users to operate a computer through voice commands.
New Software Methods Enhance Sedimentation Velocity Analysis of Protein Aggre... (KBI Biopharma)
1) New software methods have improved the resolution and sensitivity of sedimentation velocity analysis, making it useful for comparability testing, formulation development, and quality control.
2) A method developed by Peter Schuck increases resolution of multiple components and detects very minor components below 1%. It provides higher resolution than SEC columns.
3) Another new method allows obtaining conformation and mass information at extremely low concentrations below 1 microgram total protein.
Measuring Comparability of Conformation, Heterogeneity and Aggregation with C... (KBI Biopharma)
"Measuring Comparability of Conformation, Heterogeneity, and Aggregation with Circular Dichroism and Analytical Ultracentrifugation", invited talk, State of the Art Methods for the Characterization of Biological Products and Assessment of Comparability, NIH, June 2003
This document provides an overview of small field reference dosimetry. It discusses the challenges of small field dosimetry due to lack of lateral electron equilibrium and detector size limitations. It describes the IAEA framework for reference dosimetry of small and non-standard fields, including determining equivalent square fields and reference conditions. Practical procedures for determining output factors and reference dosimetry for devices like the Cyberknife and Gammaknife are outlined. The importance of relative and in-vivo dosimetry for complex treatments using small fields is also covered.
Cooper Environmental’s Xact® 625i is designed for high time resolution multi-metals monitoring of ambient air, with detection limits that rival those of laboratory analysis. The Xact® 625i comes standard with a solid-state meteorological sensor and Cooper Environmental’s proprietary ADAPT analysis package, making the instrument one of the most powerful air pollution source detection offerings in the industry.
1. The document proposes a novel adaptive optics approach combining conventional and sensorless adaptive optics to optimize coupling between an adaptive optics system and single mode fibers.
2. It describes a laboratory experiment demonstrating this approach, achieving 56% coupling efficiency by correcting non-common path aberrations in static mode and maintaining stability during turbulence.
3. The approach was also tested on a 1.5m telescope, improving starlight coupling from 11% to 63% by applying the sensorless correction in static mode.
This document summarizes research on enhancing gene expression programming (GEP) for Reynolds-averaged Navier-Stokes equations turbulence modeling with unsupervised clustering. It presents a GEP-enhanced multi-model framework that uses feature selection, dimensionality reduction, and clustering to assign different turbulence models to distinct regions of a flow, improving simulation accuracy. Results show the approach produces more accurate mean velocities and Reynolds stresses for a body-of-revolution testcase compared to baseline and GEP-driven models. Ongoing work includes optimizing the framework configuration and extending it to 3D domains.
A novel auto-tuning method for fractional order PID controllers (ISA Interchange)
Fractional order PID controllers attract increasing interest from the research community due to their proven advantages. The classical tuning approach for these controllers is based on specifying a certain gain crossover frequency, a phase margin and a robustness to gain variations. To tune the fractional order controllers, the modulus, phase and phase slope of the process at the imposed gain crossover frequency are required. Usually these values are obtained from a mathematical model of the process, e.g. a transfer function. In the absence of such a model, an auto-tuning method that is able to estimate these values is a valuable alternative. Auto-tuning methods are among the least discussed design methods for fractional order PID controllers. This paper proposes a novel approach to the auto-tuning of fractional order controllers. The method is based on a simple experiment that is able to determine the modulus, phase and phase slope of the process required in the computation of the controller parameters. The proposed design technique is simple and efficient in ensuring the robustness of the closed loop system. Several simulation examples are presented, including the control of processes exhibiting integer and fractional order dynamics.
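The three specifications mentioned in the abstract can be written out explicitly. Assuming a common fractional order PID form (the paper may use a variant), with process $P(s)$, imposed gain crossover frequency $\omega_c$ and phase margin $\varphi_m$:

```latex
C(s) = k_p\left(1 + \frac{k_i}{s^{\lambda}} + k_d\, s^{\mu}\right),
\qquad
\begin{aligned}
\bigl|C(j\omega_c)\,P(j\omega_c)\bigr| &= 1 && \text{(gain crossover)}\\
\arg\bigl[C(j\omega_c)\,P(j\omega_c)\bigr] &= -\pi + \varphi_m && \text{(phase margin)}\\
\left.\frac{d}{d\omega}\arg\bigl[C(j\omega)\,P(j\omega)\bigr]\right|_{\omega=\omega_c} &= 0 && \text{(flat phase: robustness to gain variations)}
\end{aligned}
```

Evaluating the second and third conditions requires exactly the modulus, phase and phase slope of $P(j\omega_c)$ — the three quantities the proposed auto-tuning experiment estimates when no transfer function is available.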
Parameter Estimation using Experimental Bifurcation Diagrams (Andy Salmon)
The document discusses parameter estimation of an aerodynamic model from experimental bifurcation diagrams. It summarizes an experiment that captured the dynamic characteristics of a scale aircraft model, observing a post-stall pitch oscillation. The author proposes a novel method of parameter estimation that uses bifurcation analysis rather than time-domain analysis to estimate the dynamic parameters governing the limit cycle oscillation. However, the equations of motion proved incomplete, and inaccuracies in a numerical simulation prevented successful implementation of the bifurcation-based estimation method. Further work is needed to fully understand and model the system before the proposed approach can be applied.
An Information-Theoretic Approach for Clonal Selection Algorithms (Mario Pavone)
This document presents an information-theoretic approach for clonal selection algorithms (CSA) used to solve global optimization problems. It introduces CSA as biologically-inspired algorithms, describes the key operators of cloning, hypermutation, and aging. It also discusses using Kullback-Leibler, Rényi, and Von Neumann divergences to analyze the learning process of CSA. The document outlines applying CSA to benchmark optimization functions and comparing results.
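As a concrete instance of the information-theoretic machinery, the Kullback-Leibler divergence between two discrete distributions can be computed directly; a minimal sketch (the distributions below are toy examples, not the paper's population statistics):

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) for discrete distributions,
    using the natural log. Divergences of this family are what the paper
    uses to track how the clonal population evolves during learning."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

uniform = [0.25, 0.25, 0.25, 0.25]
peaked = [0.70, 0.10, 0.10, 0.10]
```

D(p || p) is zero, and the divergence grows as the population concentrates away from the reference distribution — which is how convergence of the learning process can be quantified.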
Online Detection of Shutdown Periods in Chemical Plants: A Case Study (Manuel Martín)
In the process industry, chemical processes are controlled and monitored using readings from multiple physical sensors across the plants. These physical sensors are supplemented by soft sensors, i.e. adaptive predictive models, which are often used to compute hard-to-measure variables of the process. For soft sensors to work well and adapt to changing operating conditions, they need to be provided with relevant data. As production plants are regularly stopped, data instances generated during shutdown periods have to be identified to avoid updating these predictive models with wrong data. We present a case study concerned with the operation of a large chemical plant over a two-year period. The task is to robustly and accurately identify the shutdown periods even in the case of multiple sensor failures. State-of-the-art methods were evaluated using the first half of the dataset for calibration and the other half for measuring performance. Results show that shutdowns (i.e. sudden changes) can be quickly detected in any case, but the detection delay of startups (i.e. gradual changes) is directly related to the choice of window size.
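The window-size effect reported in the case study can be illustrated with a toy detector: flag the plant as down when a trailing-window mean of plant load crosses a threshold. An abrupt shutdown crosses within a few samples for any window, while for a gradual startup ramp the delay grows with the window. The signal values, threshold and window sizes below are illustrative assumptions, not the plant's data:

```python
def first_crossing(signal, window, threshold, below=True):
    """Index of the first sample at which the trailing-window mean
    crosses the threshold (below it for shutdowns, above for startups)."""
    for i in range(window, len(signal) + 1):
        m = sum(signal[i - window:i]) / window
        if (m < threshold) if below else (m > threshold):
            return i - 1
    return None

# Toy load signals: abrupt shutdown at t=20; gradual startup ramp from t=20
shutdown = [100.0] * 20 + [0.0] * 30
startup = [0.0] * 20 + [5.0 * k for k in range(21)] + [100.0] * 10
```

With these signals the shutdown is flagged one sample after the step for a window of 3, while the startup is flagged later with a window of 11 than with a window of 3 — the delay/robustness trade-off the abstract describes.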
Eric Katzen worked as an intern at NASA's Hazardous Gas Detection Laboratory (HGDL) under the mentorship of electrical engineer Reggie Martin. His responsibilities included developing leak test equipment and procedures and calibrating helium mass spectrometers. Specifically, he developed a Leak Tester Cart to test for leak rates, amended HGDL procedures to conform with industry standards, calibrated helium mass spectrometers, tested fittings for maximum leak rates, and created a webpage organizing leak detection specifications and standards. Through this work, he gained valuable experience in leak detection techniques and helped improve HGDL testing operations.
This paper applies inverse transform sampling to select training points for surrogate models. Inverse transform sampling uniformly generates a sequence of real numbers between 0 and 1 as the probabilities at sample points; the coordinates of the sample points are then evaluated using the inverse of the Cumulative Distribution Function (CDF). The inputs to the surrogate models are assumed to be independent random variables, and their distributions are fitted to observed data before being used for the sampling. The sample points obtained in this way reflect the frequency of occurrence of the inputs: point density is larger in the regions where the Probability Density Functions (PDF) are higher, so the most densely sampled regions are those most prevalent in the observations of the random variables. Inverse transform sampling is applied to the development of surrogate models for window performance evaluation. The distributions of three climatic conditions are fitted: (i) the outside temperature, (ii) the wind speed, and (iii) the solar radiation. The sample climatic conditions obtained by inverse transform sampling are used as training points to evaluate the heat transfer through a generic triple pane window. Using the simulation results at the sample points, surrogate models are developed to represent the heat transfer through the window as a function of the climatic conditions. Surrogate models developed using inverse transform sampling are observed to provide higher accuracy for the window performance evaluation than models developed directly from the Sobol sequence.
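The sampling mechanism itself is compact. A minimal sketch for a single input variable — the exponential distribution and rate below are illustrative stand-ins; the paper fits distributions to observed climatic data and would use their fitted inverse CDFs:

```python
import math
import random

def inverse_transform_sample(inv_cdf, n, seed=0):
    """Draw n samples by pushing uniform(0, 1) draws through an inverse CDF."""
    rng = random.Random(seed)
    return [inv_cdf(rng.random()) for _ in range(n)]

# Exponential example: CDF F(x) = 1 - exp(-lam*x), so F^-1(u) = -ln(1-u)/lam
lam = 2.0
samples = inverse_transform_sample(lambda u: -math.log(1.0 - u) / lam, 10_000)
```

The resulting sample density follows the PDF: for this exponential, far more points land near zero (where the density is high) than in the distribution's tail, which is exactly the property the paper exploits when choosing training points.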
A first order hyperbolic framework for large strain computational... (Jibran Haider)
An explicit Total Lagrangian momentum-strains mixed formulation, in the form of a system of first order hyperbolic conservation laws, has recently been published to overcome the shortcomings of the traditional second order displacement-based formulation when using linear tetrahedral elements.
The formulation, where the linear momentum and the deformation gradient are treated as unknown variables, has been implemented within the cell centred finite volume environment in OpenFOAM. The numerical solutions have performed extremely well in bending dominated nearly incompressible scenarios without the appearance of any spurious pressure modes, yielding an equal order of convergence for velocities and stresses.
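In the notation of the related publications, the first order system takes roughly the following form — a sketch under the standard Total Lagrangian assumptions, with $\boldsymbol{p}$ the linear momentum per unit undeformed volume, $\boldsymbol{F}$ the deformation gradient, $\boldsymbol{P}$ the first Piola-Kirchhoff stress, $\rho_0$ the material density and $\boldsymbol{b}$ a body force:

```latex
\frac{\partial \boldsymbol{p}}{\partial t} - \nabla_0 \cdot \boldsymbol{P}(\boldsymbol{F}) = \rho_0\, \boldsymbol{b},
\qquad
\frac{\partial \boldsymbol{F}}{\partial t} - \nabla_0 \left(\frac{\boldsymbol{p}}{\rho_0}\right) = \boldsymbol{0}
```

Treating $\boldsymbol{p}$ and $\boldsymbol{F}$ as the primary unknowns, closed by a constitutive law $\boldsymbol{P}(\boldsymbol{F})$, is what yields the equal order of convergence for velocities and stresses noted above.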
For more insight into my research, please visit my website:
http://jibranhaider.weebly.com/
This document summarizes a proposed system risk analysis method using bearing sensor data. It includes:
1. An introduction describing condition-based maintenance using bearing sensor data to detect anomalies.
2. A proposed method using a convolutional autoencoder (CAE) for feature extraction from bearing signals, followed by a T² control chart and EWMA for fault detection. Only normal bearing data is used to train these models.
3. An experiment applying the method to data from a bearing test rig, comparing normal and outer race fault bearings. The CAE and statistical techniques are evaluated on their ability to detect faults in test data.
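The EWMA stage of the proposed method can be sketched on its own: smooth a monitored statistic (such as the CAE reconstruction error) and raise an alarm when the smoothed value exceeds the steady-state control limit. The smoothing constant, limit width and toy error sequence below are illustrative assumptions, not the paper's settings:

```python
import math

def ewma_chart(values, lam=0.2, mu0=0.0, sigma=1.0, L=3.0):
    """One-sided EWMA control chart: returns the indices where the smoothed
    statistic exceeds the steady-state upper control limit
    UCL = mu0 + L * sigma * sqrt(lam / (2 - lam))."""
    ucl = mu0 + L * sigma * math.sqrt(lam / (2.0 - lam))
    z, alarms = mu0, []
    for t, x in enumerate(values):
        z = lam * x + (1.0 - lam) * z  # exponentially weighted moving average
        if z > ucl:
            alarms.append(t)
    return alarms

# Toy reconstruction errors: in control around 0 for 50 samples, then a
# persistent +2-sigma shift (illustrative of a developing outer race fault)
errors = [0.0] * 50 + [2.0] * 30
alarms = ewma_chart(errors)
```

Because only the in-control (normal bearing) behaviour is needed to set mu0, sigma and the limit, this matches the abstract's point that the monitoring models are trained on normal data alone.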
This document provides an overview of risk management and quality control using statistical process control charts. It discusses [1] managing quality risk through control charts, [2] different types of risks including material, consequential, social, legal, and political risks, and [3] best practices for risk management including policies, methodologies, and resources. The document also covers control chart fundamentals, calculating control limits, identifying assignable causes, and process improvement.
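The control-limit calculation mentioned above reduces to a center line plus a three-sigma margin scaled by subgroup size; a minimal X-bar chart sketch (the subgroup means, subgroup size and sigma are made-up numbers for illustration):

```python
import math

def three_sigma_limits(subgroup_means, subgroup_size, sigma):
    """Shewhart X-bar chart limits: center line +/- 3*sigma/sqrt(n)."""
    center = sum(subgroup_means) / len(subgroup_means)
    margin = 3.0 * sigma / math.sqrt(subgroup_size)
    return center - margin, center, center + margin

means = [10.1, 9.9, 10.0, 10.2, 9.8]
lcl, cl, ucl = three_sigma_limits(means, subgroup_size=4, sigma=0.4)
```

Points outside (lcl, ucl) are the candidates for assignable causes that the document discusses.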
Prediction of the time to complete a series of surgical cases to avoid OR ove... (Rene Alvarez)
This study aimed to develop a methodology to accurately predict the time needed to complete a series of surgical cases in order to avoid overutilization of operating room time. The researchers analyzed data on 6,090 cardiac surgeries performed between 2004 and 2009. They fitted lognormal distributions to surgical times and developed a method based on the Fenton-Wilkinson approximation to estimate the total time of a scheduled series of cases. When tested on 95 actual schedules over three months in 2009, the methodology accurately predicted the risk of overtime in most cases and helped minimize overutilization of operating room time.
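The Fenton-Wilkinson step can be sketched directly: approximate the total duration of a scheduled series (a sum of independent lognormal case durations) by a single lognormal matching the first two moments. The case parameters below are hypothetical, not the study's fitted values:

```python
import math

def fenton_wilkinson(params):
    """Approximate the sum of independent lognormal durations by one
    lognormal with the same mean and variance (Fenton-Wilkinson).
    params: list of (mu, sigma) pairs on the log-duration scale."""
    M = sum(math.exp(mu + s * s / 2.0) for mu, s in params)           # mean of the sum
    V = sum((math.exp(s * s) - 1.0) * math.exp(2.0 * mu + s * s)
            for mu, s in params)                                       # variance of the sum
    sigma2 = math.log(1.0 + V / (M * M))
    return math.log(M) - sigma2 / 2.0, math.sqrt(sigma2)

# Three hypothetical cases on a log-minutes scale (not the study's data)
mu, sigma = fenton_wilkinson([(4.6, 0.30), (5.0, 0.25), (4.8, 0.35)])
total_mean = math.exp(mu + sigma ** 2 / 2.0)  # expected total duration
```

With the fitted (mu, sigma) in hand, any quantile of the approximating lognormal gives the overtime risk for a candidate schedule.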
The TRASGO project aims to develop an innovative cosmic ray detector based on timing RPCs. The detector, called TRASGO, will be able to measure particle timing, tracking, and identification. It will consist of timing RPC planes with 100ps time resolution, a fast tracking algorithm called TimTrack, and a particle identification method called MIDAS. An array of 10-50 TRASGO detectors called MEIGA will be installed to study cosmic rays around the knee and test simulation packages. The MEIGA collaboration has been formed between universities in Spain and Portugal to develop the detectors and carry out the cosmic ray measurements.
Ferrara 2009, regional congress: the tools to achieve in the ablation of... (Centro Diagnostico Nardi)
1) The document discusses tools and techniques for achieving pulmonary vein isolation (PVI) to treat atrial fibrillation, including efficacy and safety data from multiple studies and techniques.
2) Mapping and ablation technologies have advanced, including 3D mapping systems, cryoballoon ablation, and multi-electrode catheters, improving identification of arrhythmogenic substrates and tailored lesion formation.
3) Large surveys of AF ablation outcomes show success rates without antiarrhythmic drugs of 74.9-84% for paroxysmal AF, 74.8% for persistent AF, and 71% for permanent AF, with overall complication rates of 4.54%. Advancing technologies may further improve results.
This document summarizes research on automatically classifying frog calls using wireless sensor networks and machine learning techniques. The researchers extracted features like MFCCs and wavelet coefficients from frog vocalizations and used k-NN and genetic algorithms to select an optimal feature subset and classify four frog species. Their results showed MFCCs achieved higher classification accuracy compared to wavelet features and that 8 MFCCs provided an optimized tradeoff between performance and computational cost for use on wireless sensor nodes.
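The classification stage is plain k-NN over per-call feature vectors. A toy sketch with 2-D features standing in for the 8 selected MFCCs (feature values and species labels are made up for illustration):

```python
from collections import Counter

def knn_classify(train, query, k=3):
    """Majority-vote k-NN with squared Euclidean distance.
    train: list of (feature_vector, label) pairs, e.g. the selected
    MFCC vector for each recorded call (toy 2-D features here)."""
    dist = lambda fv: sum((x - y) ** 2 for x, y in zip(fv, query))
    neighbours = sorted(train, key=lambda item: dist(item[0]))[:k]
    return Counter(label for _, label in neighbours).most_common(1)[0][0]

# Two well-separated toy clusters standing in for two frog species
train = [((0.1, 0.2), "species_A"), ((0.2, 0.1), "species_A"),
         ((0.0, 0.1), "species_A"),
         ((0.9, 0.8), "species_B"), ((0.8, 0.9), "species_B"),
         ((1.0, 1.0), "species_B")]
```

The simplicity of this classifier is part of the point: distance computations over a handful of MFCCs are cheap enough to run on a wireless sensor node, which is why the feature-subset size matters.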
SEGMENTATION OF POLARIMETRIC SAR DATA WITH A MULTI-TEXTURE PRODUCT MODEL (grssieee)
1) The document describes a segmentation algorithm for polarimetric SAR (PolSAR) data that can model both scalar-texture and multi-texture scattering.
2) The algorithm uses log-cumulants and hypothesis testing to determine whether a scalar-texture or dual-texture model best fits the data within each segment.
3) The algorithm is tested on simulated multi-texture PolSAR data and is shown to accurately segment the classes and estimate their texture parameters. However, when applied to real data sets, the algorithm only finds the simpler scalar-texture case.
Prediction of the time to complete a series of surgical cases to avoid OR ove...Rene Alvarez
This study aimed to develop a methodology to accurately predict the time needed to complete a series of surgical cases in order to avoid overutilization of operating room time. The researchers analyzed data on 6,090 cardiac surgeries performed between 2004-2009. They fitted lognormal distributions to surgical times and developed a method based on the Fenton-Wilkinson approximation to estimate the total time of a scheduled series of cases. When tested on 95 actual schedules over 3 months in 2009, the methodology accurately predicted the risk of overtime in most cases and helped minimize overutilization of operating room time.
The TRASGO project aims to develop an innovative cosmic ray detector based on timing RPCs. The detector, called TRASGO, will be able to measure particle timing, tracking, and identification. It will consist of timing RPC planes with 100ps time resolution, a fast tracking algorithm called TimTrack, and a particle identification method called MIDAS. An array of 10-50 TRASGO detectors called MEIGA will be installed to study cosmic rays around the knee and test simulation packages. The MEIGA collaboration has been formed between universities in Spain and Portugal to develop the detectors and carry out the cosmic ray measurements.
2009 ferrara, congresso regionale, i tools da raggiungere nell'ablazione dell...Centro Diagnostico Nardi
1) The document discusses tools and techniques for achieving pulmonary vein isolation (PVI) to treat atrial fibrillation, including efficacy and safety data from multiple studies and techniques.
2) Mapping and ablation technologies have advanced, including 3D mapping systems, cryoballoon ablation, and multi-electrode catheters, improving identification of arrhythmogenic substrates and tailored lesion formation.
3) Large surveys of AF ablation outcomes show success rates without antiarrhythmic drugs of 74.9-84% for paroxysmal AF, 74.8% for persistent AF, and 71% for permanent AF, with overall complication rates of 4.54%. Advancing technologies may further improve results.
This document summarizes research on automatically classifying frog calls using wireless sensor networks and machine learning techniques. The researchers extracted features like MFCCs and wavelet coefficients from frog vocalizations and used k-NN and genetic algorithms to select an optimal feature subset and classify four frog species. Their results showed MFCCs achieved higher classification accuracy compared to wavelet features and that 8 MFCCs provided an optimized tradeoff between performance and computational cost for use on wireless sensor nodes.
An Autocorrelation Analysis Approach to Detecting Land Cover Change using Hyper-Temporal Time-Series Data.pdf
1. An Autocorrelation Analysis Approach to Detecting Land Cover Change using Hyper-Temporal Time-Series Data
W. Kleynhans 1,2, B.P. Salmon 1,2, J.C. Olivier 1, K.J. Wessels 2, F. van den Bergh 2
1 Electrical and Electronic Engineering, University of Pretoria, South Africa
2 Remote Sensing Research Unit, Meraka Institute, CSIR, Pretoria, South Africa
July 25, 2011
2. Outline
Introduction
Study Area
Methodology
Results
Conclusion
Examples
www.csir.co.za
3. Introduction
Remote sensing satellite data are an effective way to monitor and evaluate land cover changes.
A bi-temporal change detection approach is not always appropriate.
The temporal frequency should be high enough to distinguish change events from natural phenological cycles.
Having a series of equally sampled acquisitions opens the door for the utilization of basic signal processing methods.
Temporal autocorrelation was effectively used as an input feature for a change detection alarm.
4. Problem statement
Settlement expansion is the most pervasive form of land cover change in South Africa.
Objective is to develop a change alarm that is able to detect the formation of new settlement developments.
These changes typically occur in areas that are naturally vegetated.
Change events should be distinguished from natural phenological cycles.
6. Study Area
The Gauteng province is located in northern South Africa and covers a total area of approximately 17000 km².
592 natural vegetation and 372 settlement no-change MODIS pixels were identified.
181 change MODIS pixels were identified.
Each pixel has a 7 year time-series (8-day composited: 2001-2007).
12. Temporal ACF method
Assume the time series in vector form:

$X = X_n, \quad n \in \{1, 2, \ldots, N\}$   (1)

The normalized ACF for time-series $X$:

$R_{XX}(\tau) = \dfrac{E\left[(X_n - \mathrm{mean}(X))(X_{n+\tau} - \mathrm{mean}(X))\right]}{\mathrm{var}(X)}$   (2)

Stationarity assumption: $\mathrm{mean}(X)$ and $\mathrm{var}(X)$ should be constant!
Reflection time-series from nature very rarely adhere to this assumption, but some do more than others...
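As an illustrative sketch (not the authors' implementation), the normalized ACF of equation (2) can be computed with NumPy. The 46-composites-per-year figure below is an assumption based on the 8-day compositing period:

```python
import numpy as np

def normalized_acf(x, tau):
    """Normalized autocorrelation R_XX(tau) of a 1-D time series,
    assuming weak stationarity (constant mean and variance)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    xc = x - x.mean()                       # subtract mean(X)
    # Average of lagged products, normalized by var(X).
    return np.sum(xc[: n - tau] * xc[tau:]) / ((n - tau) * x.var())

# A purely seasonal (phenological) series stays highly autocorrelated
# at a lag of one full period: ~46 eight-day composites per year.
t = np.arange(322)                          # 7 years of 8-day composites
seasonal = np.sin(2 * np.pi * t / 46)
print(normalized_acf(seasonal, 46))         # ~1.0
```

A changed pixel violates the stationarity assumption, which is what shifts its ACF away from that of a stable seasonal cycle.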
15. Use of simulated change to determine detection parameters
Time-series of no-change areas are relatively easy to obtain, but examples of real change are much more difficult.
Solution: obtain only examples of no-change time-series and simulate the change time-series using time-series blending.
Using simulated change, the start and rate of change can be controlled.
The start-of-change date was drawn at random from a uniform distribution.
The duration of change was varied between 6 and 24 months.
Using a no-change and simulated change dataset, the optimal band, lag and threshold are determined ("off-line" optimization phase).
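A minimal sketch of the time-series blending idea, assuming a linear transition between two hypothetical no-change series; the function name, series shapes, and parameter values are illustrative (92 samples of 8-day composites ≈ a 24-month transition):

```python
import numpy as np

def blend_change(before, after, start, duration):
    """Simulate a change time-series by linearly blending a no-change
    'before' series (e.g. natural vegetation) into an 'after' series
    (e.g. settlement) from sample `start` over `duration` samples."""
    n = len(before)
    w = np.clip((np.arange(n) - start) / duration, 0.0, 1.0)
    return (1.0 - w) * before + w * after

# Hypothetical 7-year series of 46 eight-day composites per year.
rng = np.random.default_rng(0)
t = np.arange(322)
veg = 0.50 + 0.20 * np.sin(2 * np.pi * t / 46)      # strong seasonal cycle
settle = 0.35 + 0.05 * np.sin(2 * np.pi * t / 46)   # damped settlement cycle
start = int(rng.integers(46, 322 - 92))             # uniform start of change
sim = blend_change(veg, settle, start, 92)          # ~24-month transition
```

Because the start and duration are parameters, a whole labelled change dataset can be generated from no-change examples alone.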
16. Use of simulated change to determine detection parameters
0.9
Band 1
0.85 Band 2
Band 3
0.8 Band 4
Band 5
0.75
Band 6
Band 7
NDVI
A
0.7
O
0.65
0.6
0.55
0.5
40 80 120 160 200 240 280 320 360
Time Lag (Days)
17. Operational phase
A pixel is labelled as having changed by evaluating the following:

$\text{Change} = \begin{cases} \text{true} & \text{if } R^{b^*}_{XX}(\tau) \ge \delta_\tau \\ \text{false} & \text{if } R^{b^*}_{XX}(\tau) < \delta_\tau \end{cases}$

where $b^*$ is the selected spectral band and $\delta_\tau$ the threshold at the selected lag $\tau$.
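The decision rule can be sketched as follows; the lag of 12 composites (≈ 96 days) and threshold δ = 0.16 are taken from the results slides, while the test series and the magnitude of the trend are hypothetical:

```python
import numpy as np

def change_alarm(x, lag=12, delta=0.16):
    """Flag a pixel time-series as changed when its normalized ACF at
    `lag` is at or above the threshold `delta`. A lag of 12 samples of
    8-day composites is ~96 days; delta=0.16 as found off-line."""
    x = np.asarray(x, dtype=float)
    xc = x - x.mean()
    r = np.sum(xc[:-lag] * xc[lag:]) / ((len(x) - lag) * x.var())
    return r >= delta

# Hypothetical series: a stable seasonal cycle has a low ACF at a
# ~96-day lag, while a gradual trend (non-stationary) keeps it high.
t = np.arange(322)
seasonal = 0.50 + 0.20 * np.sin(2 * np.pi * t / 46)
changed = seasonal + 0.003 * np.maximum(t - 150, 0)
print(change_alarm(seasonal), change_alarm(changed))  # False True
```

The same rule, swept over candidate bands, lags, and thresholds on the simulated dataset, is what yields the operational parameter set.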
19. Results using simulated change
Confusion matrix, overall accuracy (OA) and optimal threshold (δ*) showing the best land cover change detection performance during the off-line optimization phase, using MODIS band 4 (550 nm) with a lag of 96 days:

                    | Simulated change | No change |  δ*  |   OA
                    | (n=592)          | (n=482)   |      |
Change detected     | 75.17%           | 14.73%    | 0.16 | 80.22%
No change detected  | 24.83%           | 85.27%    |      |
20. Results using real change
Confusion matrix, overall accuracy (OA) and threshold (δ) for the case of real change detection using MODIS band 4 (550 nm) with a lag of 96 days, as determined during the off-line optimization phase:

                    | Real change | No change |  δ   |   OA
                    | (n=181)     | (n=482)   |      |
Change detected     | 92.27%      | 15.35%    | 0.16 | 88.46%
No change detected  | 7.73%       | 84.65%    |      |
21. Results: Timing of the change

Mean start of change |   OA
2001/06              | 70.67%
2002/06              | 83.57%
2003/06              | 85.33%
2004/06              | 85.43%
2005/06              | 84.92%
2006/06              | 81.74%
2007/06              | 76.66%
23. Concluding remarks
A change metric can be formulated by selecting the lag that shows the highest separability between the ACF of no-change and simulated change time-series examples.
The optimal band, lag and threshold selection is determined in an "off-line" optimization phase.
After the detection parameters were determined, the method was run blindly over the study area.
The method shows a slight decrease in performance if the start of change is in the first or last year.