This document discusses a method for determining aerosol optical depth from satellite observations. It summarizes:
1) Aerosols can be characterized by analyzing the spatial, spectral, temporal, and directional properties of observations from satellites.
2) The method separates the aerosol signal from reflections from the surface using a model that considers different timescales for land and atmospheric changes.
3) The top-of-atmosphere reflectance is modeled as the sum of surface reflectance, gas absorption, molecular scattering, and aerosol scattering terms.
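The additive model in point 3 can be sketched in a simplified single-scattering form. The function name and sample values below are illustrative assumptions, and surface–atmosphere coupling terms are omitted; this is a schematic of the decomposition, not the paper's exact formulation:

```python
def toa_reflectance(rho_surface, rho_rayleigh, rho_aerosol, t_gas, t_total):
    """Simplified top-of-atmosphere reflectance:
    gas transmission times (molecular + aerosol path reflectance
    plus surface reflectance attenuated by the two-way scattering
    transmission). Multiple-reflection coupling terms are omitted."""
    return t_gas * (rho_rayleigh + rho_aerosol + t_total * rho_surface)

# Illustrative values only
rho_toa = toa_reflectance(rho_surface=0.10, rho_rayleigh=0.03,
                          rho_aerosol=0.05, t_gas=0.95, t_total=0.85)
```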
The document contains images from the Mars Reconnaissance Orbiter of various geological features on Mars, including Bonneville Crater, a Martian dust devil, araneiform features caused by sublimating carbon dioxide, a volcanic vent, and unexplained surface streaks that may have been caused by flowing water.
Anisotropic AVO Analysis for Reservoir Characterization in Derby Field Southe... - IOSRJAP
Anisotropic AVO analysis for reservoir characterization in the Derby field, southeastern Niger Delta, has been investigated. The objective of this study was to investigate the AVO response of shale over gas sands on isotropic and anisotropic synthetic models and on real CDP gathers. This was done by plotting reflection amplitude versus offset and carrying out AVO intercept-gradient analysis on the HD1 horizon. The results of the models were compared with the real CDP gather to deduce the more robust synthetic model for efficient AVO analysis in the field. Well data, processed CDP gathers, and the Hampson-Russell GEOVIEW and AVO modules were used for the analysis. Results revealed that the amplitude-versus-offset plots of the isotropic and anisotropic synthetic models agree at near offsets but show opposite AVO responses at far offsets due to the increasing contribution of seismic anisotropy. The results of the anisotropic synthetic model correlate well with the CDP gathers, indicating that seismic anisotropy is an important factor in AVO analysis. Results also show that a bright spot indicative of gas-charged sands was delineated for both the isotropic and anisotropic models, but with different classes of AVO anomalies and products. The isotropic synthetic model shows a class II AVO anomaly with a positive AVO product, while the anisotropic model shows a class IV AVO anomaly with a negative AVO product, comparable to the real CDP gather. These analyses clearly show that a description of rock without anisotropy, especially seismic anisotropy, is incomplete, since most rocks are not completely isotropic. Therefore, accounting for seismic anisotropy in AVO synthetic modelling will ensure that the correct earth model is predicted and the reservoir is adequately characterized.
Oslo university basic well log analysis introduction - Javier Espinoza
The document provides an overview of basic well log analysis methods used to derive petrophysical properties for hydrocarbon exploration. It discusses the borehole environment, including invasion of drilling mud into formations. It also covers open and cased hole logs, the three main types of logs (electrical, nuclear, acoustic), and how logs are used to infer properties like lithology, porosity, permeability, water saturation, and resistivity. Key concepts discussed include Archie's law, borehole resistivity profiles, and correcting mud and water resistivities for formation temperature.
science behind well logging_dileep p allavarapu - knigh7
This document provides an overview of well logging exercises for students. It discusses the importance of various disciplines in exploration and production such as reservoir modeling, drilling and logging, geology and seismic. The ultimate goal of well log interpretation is to calculate water saturation, hydrocarbon saturation and effective porosity in reservoir rock units. Key points are provided about common rock types like sandstone, shale and limestone. The document reviews important well logs like gamma ray, resistivity, neutron and density and how they can be used to interpret lithology, porosity, fluid content and saturation. Formulas like Archie's equation and Indonesian equation are also summarized. Contact information is provided for institutions where students can learn more.
The spontaneous potential (SP) log measures naturally occurring electrical voltages in formations and is used to determine lithology and permeability. Specifically, SP response is created by differences in salinity between mud filtrate and formation waters in permeable beds. SP can be used to detect permeable beds, determine formation water resistivity, and calculate shale volume. Key factors that influence the SP measurement include the ratio of mud filtrate to formation water resistivity and the presence of hydrocarbons or shale.
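The dependence of SP on the mud-filtrate to formation-water resistivity ratio is commonly expressed through the static SP relation SSP = -K log10(Rmf/Rw). A minimal sketch using common textbook constants; the input values are illustrative, not taken from this document:

```python
import math

def static_sp(rmf, rw, temp_f):
    """Static SP (mV) from the standard textbook relation
    SSP = -K * log10(Rmf/Rw), with K = 61 + 0.133 * T(degF).
    rmf, rw: mud-filtrate and formation-water resistivities (ohm-m),
    both at formation temperature."""
    k = 61 + 0.133 * temp_f
    return -k * math.log10(rmf / rw)

# Fresh filtrate over saline formation water gives a negative deflection
ssp = static_sp(rmf=0.7, rw=0.07, temp_f=150)  # about -81 mV
```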
well logging tools and exercise_dileep p allavarapu - knigh7
Logging is a process that provides comprehensive formation information through continuously recording parameter measurements with depth. It plays an important role in exploration and production by obtaining resistivity, porosity, and lithology logs to identify hydrocarbon-bearing zones. Different disciplines like drilling, logging, core analysis, and reservoir modeling are interrelated and provide both open and cased hole data. Logs are interpreted to calculate parameters like water saturation, hydrocarbon saturation, and effective porosity, with the goal of determining hydrocarbon saturation multiplied by effective porosity in reservoir units. Accurate interpretation requires integration of log data with core analysis and rock physics studies.
Instrument to measure the bidirectional reflectance - ajsatienza
This instrument measures the bidirectional reflectance distribution function (BRDF) of surfaces with the following properties:
1. It measures the BRDF for eight illumination angles from 0 to 65 degrees, three colors (475, 570, 658 nm), and over 100 selected viewing angles.
2. The viewing zenith angles range from 5 to 65 degrees, and the azimuth angles range from 0 to ±180 degrees relative to the illumination direction.
3. Tests show it can measure the BRDF of flat surfaces with a precision of 1-5% and an accuracy of 10% of the measured reflectance.
This document discusses principles of well logging. It describes how well logging aims to evaluate subsurface hydrocarbon accumulations through measuring properties in boreholes. It outlines different types of hydrocarbon traps and elements in a petroleum system. It then explains what a well log is and different types of logs used, including gamma ray, resistivity, sonic, and neutron logs. Gamma ray logs specifically measure natural radioactivity to distinguish between lithologies like sandstone and shale. The document provides details on interpreting gamma ray logs and calculating shale volume from gamma ray readings.
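The shale-volume calculation from gamma ray readings mentioned above is typically a linear gamma-ray index as a first pass. A minimal sketch; the clean-sand and shale baselines here are illustrative, and nonlinear corrections (e.g. Larionov's) are often applied to the index afterwards:

```python
def gamma_ray_index(gr, gr_clean, gr_shale):
    """Linear gamma-ray index, often used directly as a first-pass
    shale volume estimate: IGR = (GR - GRclean) / (GRshale - GRclean)."""
    igr = (gr - gr_clean) / (gr_shale - gr_clean)
    return min(max(igr, 0.0), 1.0)  # clamp to [0, 1]

# Example: a reading of 60 API between clean sand (20) and shale (120)
vsh = gamma_ray_index(60, gr_clean=20, gr_shale=120)  # 0.4
```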
1. The document discusses various well logging tools and concepts used in petrophysical interpretation. It describes tools such as the spontaneous potential (SP) log, gamma ray (GR) log, resistivity logs including induction and lateral logs, and porosity logs.
2. Key concepts covered include the logging environment and factors that impact tool measurements like borehole conditions and mud properties. Interpretation techniques for evaluating permeable zones, formation resistivity, water saturation, and porosity are also summarized.
3. The document provides examples of using tools and concepts like the Archie formula to calculate water resistivity, determine hydrocarbon presence, and evaluate clean versus shaly formations. It also discusses corrections that must be applied to well log data.
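The Archie calculation referred to in point 3 can be sketched as follows. The parameter defaults (a, m, n) and the input values are generic textbook assumptions, not values from this document:

```python
def archie_sw(rw, rt, phi, a=1.0, m=2.0, n=2.0):
    """Archie water saturation for clean (shale-free) formations:
    Sw = ((a * Rw) / (phi**m * Rt)) ** (1/n).
    a, m, n are the tortuosity factor, cementation exponent, and
    saturation exponent; the defaults are common textbook values."""
    return ((a * rw) / (phi ** m * rt)) ** (1.0 / n)

# Illustrative inputs: Rw = 0.05 ohm-m, Rt = 20 ohm-m, porosity 25%
sw = archie_sw(rw=0.05, rt=20, phi=0.25)  # 0.2, i.e. 80% hydrocarbon saturation
```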
Well log interpretation involves using well log data to estimate reservoir properties. It has been used since the 1920s to qualitatively identify hydrocarbons and is now a quantitative tool. A key figure was Gustavus Archie who in the 1940s established the field of petrophysics by relating well logs to core data. His work allowed properties like porosity, permeability and fluid saturation to be estimated. A presentation on well log interpretation outlined the workflow including editing logs, estimating properties like shale volume, porosity, permeability and fluid saturation, and presented two case studies analyzing different carbonate reservoirs.
Well Log Interpretation and Petrophysical Analisis in [Autosaved] - Ridho Nanda Pratama
PT. Halliburton Logging Service is a branch of Halliburton that provides completion and production services, drilling, and reservoir evaluation to oil companies in Sumatra, Indonesia. Dery Marsan and Ridho Nanda Pratama completed an on-the-job training program at Halliburton from August to September 2015. Their project involved well log analysis to determine water saturation and the most suitable water resistivity parameters in two formations, with the objectives of identifying water zones, evaluating challenges around determining petrophysical parameters, and analyzing well data. Their analysis identified both water-bearing and possible oil-bearing zones through evaluation of gamma ray, resistivity, neutron-density crossplots, and other well logs.
This document provides an overview of well logging concepts and techniques. It discusses key well log formats and presentations, including standard tracks used for depth, resistivity, porosity, and other measurements. Common logs like SP, gamma ray, resistivity, neutron, and density are examined. Signatures indicating lithology, fluid content, and invasion are described. Proper interpretation requires understanding scales, crossovers caused by lithology changes, and matrix effects. An example of logs in a horizontal well is also provided.
This document provides an overview of induction logging techniques. It discusses the principles behind induction logging tools, including how they use transmitter and receiver coils to measure formation conductivity. It describes different coil configurations and focusing methods used to obtain measurements at various depths. The document also covers induction log corrections for effects like shoulder beds, borehole conditions, and skin effect. It provides an example induction log showing identification of thick and thin hydrocarbon zones.
The document provides information on reservoir mapping techniques and workflows. It discusses constructing structure maps, isopach maps, net pay maps, and fault maps to characterize reservoirs based on well log and seismic data. The maps are used for well placement, reserves calculations, and reservoir performance monitoring. Key steps include reservoir correlation, defining flow units, determining fluid contacts, and integrating geological and petrophysical data. The results provide insights into reservoir properties and geometry to promote optimal field development.
Brief review on Direct hydrocarbon indicators (DHI).
The presentation is part of a seismic data interpretation course that I teach for undergraduates.
The sources are indicated in the references list.
Contact me via: hatem_refaat95@hotmail.com
The document discusses enhanced reservoir characterization using borehole images and dipmeter data. It begins with an overview of how logging tools have advanced from single measurements to detailed mapping of borehole walls using modern imaging tools with hundreds of thousands of data points per meter. The main topics covered include different types of dipmeter and imaging tools, generating borehole maps for orientation, stereographic projections for analyzing dip distributions, and processing raw data into geologically interpretable outputs like image and dip logs. Overall, the document outlines the transition from traditional well logging to digital geological mapping using high-resolution borehole wall data.
Presentation on how I realized my idea for a cloud service for organizing items, and how others can realize their own ideas using concepts and technologies on cloud computing.
This document appears to be notes from a presentation or workshop about learning styles and differentiation. It includes discussions of grouping students based on their preferred learning expressions, an activity where students create representations of differentiation using their least preferred style, and notes about various learning preferences and strategies.
This document provides an overview of starting an exclusive agency with Allstate. Key points:
- Allstate offers support like brand recognition, education programs, marketing materials and 24/7 claims service to help agents build their business.
- To get started, agents need available capital, property and casualty and life/health licenses, and NASD licenses. Qualities like success, self-motivation and entrepreneurial spirit are important.
- Agents can earn revenue through property and casualty commissions, Allstate financial commissions, bonuses for education, agency establishment and development based on tiers of performance levels.
- Allstate provides comprehensive education and support like branded retail environments, yellow pages advertising, executive advantage funds
1) The document discusses the opportunity for technology to improve organizational efficiency and transition economies into a "smart and clean world."
2) It argues that aggregate efficiency has stalled at around 22% for 30 years due to limitations of the Second Industrial Revolution, but that digitizing transport, energy, and communication through technologies like blockchain can help manage resources and increase efficiency.
3) Technologies like precision agriculture, cloud computing, robotics, and autonomous vehicles may allow for "dematerialization" and do more with fewer physical resources through effects like reduced waste and need for transportation/logistics infrastructure.
The document discusses radiometric corrections for remote sensing images. It describes how digital numbers are converted to top-of-atmosphere reflectance values using calibration coefficients and solar irradiance normalization. Atmospheric corrections are needed to estimate top-of-canopy reflectance and account for effects of gas absorption, scattering, and emission using a radiative transfer model like 6S. Parameters for the 6S model include viewing geometry, atmospheric properties, and spectral filter functions. Aerosol optical thickness can be obtained from Aeronet ground stations. Radiometric calibration is needed using reference reflectance panels.
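The DN-to-reflectance conversion described above usually combines a linear radiance calibration with solar-irradiance normalization. A minimal sketch; all coefficients below are placeholders for the sensor-specific values that come from calibration metadata:

```python
import math

def dn_to_toa_reflectance(dn, gain, offset, esun, d_au, sun_zenith_deg):
    """Convert a digital number to top-of-atmosphere reflectance:
    radiance L = gain * DN + offset, then
    rho = pi * L * d^2 / (ESUN * cos(theta_s)),
    where d is the Earth-Sun distance in AU and ESUN the band's
    exo-atmospheric solar irradiance."""
    radiance = gain * dn + offset
    return (math.pi * radiance * d_au ** 2) / (
        esun * math.cos(math.radians(sun_zenith_deg)))

# Placeholder coefficients for illustration only
rho = dn_to_toa_reflectance(dn=100, gain=0.1, offset=0.0,
                            esun=2000.0, d_au=1.0, sun_zenith_deg=0.0)
```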
2005-02-01 Co-Retrieval of Aerosol Color and Surface Color from SeaWiFS Satel... - Rudolf Husar
The document discusses a method for simultaneously retrieving aerosol optical properties and aerosol-free surface reflectance from satellite images. SeaWiFS satellite data is used to co-retrieve the aerosol optical thickness and surface reflectance over land and ocean pixels. Iterative procedures are used to separate the effects of aerosols and surfaces on measured reflectance. Preliminary results over the northeastern US demonstrate the ability to map aerosol patterns and correct surface reflectance for aerosol effects.
2004-10-14 AIR-257: Satellite Detection of Aerosols Concepts and Theory - Rudolf Husar
The document summarizes a course on detecting aerosols using satellites. The course covers:
1) Introduction to satellite aerosol detection, different satellite types and their usage.
2) How satellites detect aerosol events like fires, dust storms and haze.
3) Using satellite data to study aerosols for projects like analyzing their fast response to changing environmental conditions.
4) Applications of satellite data in air quality management and issues around its use.
1. The document analyzes aerosol measurements from Higashi-Osaka, Japan to classify aerosol types into six categories and correlate aerosol optical thickness (AOT) with particulate matter (PM).
2. Aerosols were classified using k-means clustering of AERONET data into categories like dust, biomass burning, and pollution. Approximate size distributions were proposed to characterize each category.
3. Correlating AOT and PM measurements improved PM2.5 estimation from AOT by considering anthropogenic versus dust aerosols separately.
4. Aerosol retrieval algorithms were developed using the proposed aerosol models and properties to interpret MODIS data for heavy aerosol loading.
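The k-means clustering step in point 2 can be sketched with a minimal implementation. The two features used here (Angstrom exponent and single-scattering albedo) and the synthetic values are assumptions for illustration, not the study's actual feature set:

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Minimal k-means for small lists of feature tuples."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        # Assign each point to its nearest center (squared Euclidean)
        groups = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2
                                      for a, b in zip(p, centers[c])))
            groups[i].append(p)
        # Recompute centers as group means; keep old center if group empty
        centers = [tuple(sum(col) / len(g) for col in zip(*g)) if g
                   else centers[i] for i, g in enumerate(groups)]
    return centers

# Synthetic observations: dust-like (low Angstrom exponent) vs
# fine pollution-like (high Angstrom exponent) -- illustrative only
pts = ([(0.20 + 0.01 * i, 0.97) for i in range(5)] +
       [(1.60 + 0.01 * i, 0.90) for i in range(5)])
centers = kmeans(pts, k=2)
```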
Aerosol retrieval using modis data & rt code - Ahmad Mubin
This document summarizes information about aerosols including their sources, sizes, health effects, measurement techniques, and a methodology for satellite aerosol retrieval. It discusses using MODIS data at 500m resolution to estimate aerosol optical thickness over Hong Kong and compares the results to AERONET ground measurements. Key steps in the methodology include calculating top-of-atmosphere reflectance, accounting for Rayleigh scattering, surface reflectance, gas transmissions, and atmospheric effects to derive aerosol reflectance and optical thickness.
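The subtraction of Rayleigh, gas, and surface contributions to isolate aerosol reflectance can be sketched as the inversion of a simple additive TOA model. This is a linear schematic with illustrative values only; operational retrievals use radiative-transfer lookup tables rather than this closed form:

```python
def aerosol_reflectance(rho_toa, rho_rayleigh, rho_surface, t_gas, t_scat):
    """Isolate the aerosol path reflectance from a simplified
    additive TOA model:
    rho_aer = rho_toa / t_gas - rho_rayleigh - t_scat * rho_surface,
    i.e. remove gas absorption, then subtract the molecular and
    surface contributions."""
    return rho_toa / t_gas - rho_rayleigh - t_scat * rho_surface

# Illustrative values only
rho_aer = aerosol_reflectance(rho_toa=0.15675, rho_rayleigh=0.03,
                              rho_surface=0.10, t_gas=0.95, t_scat=0.85)
```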
This document provides an overview of vibrational spectroscopy, specifically reflection absorption infrared spectroscopy (RAIRS). It discusses how RAIRS works by directing infrared radiation at a sample surface, analyzing the reflected beam to determine absorbed frequencies. RAIRS has excellent energy resolution and can study surface species and reactions under various conditions. It is most sensitive for observing adsorption of molecules with transition dipoles arranged along the surface normal. The document also covers instrumentation, theory, selection rules, examples of RAIRS analysis, and limitations.
1. The document discusses various well logging tools and concepts used in petrophysical interpretation. It describes tools such as the spontaneous potential (SP) log, gamma ray (GR) log, resistivity logs including induction and lateral logs, and porosity logs.
2. Key concepts covered include the logging environment and factors that impact tool measurements like borehole conditions and mud properties. Interpretation techniques for evaluating permeable zones, formation resistivity, water saturation, and porosity are also summarized.
3. The document provides examples of using tools and concepts like the Archie formula to calculate water resistivity, determine hydrocarbon presence, and evaluate clean versus shaly formations. It also discusses corrections that must be applied to well log
Well log interpretation involves using well log data to estimate reservoir properties. It has been used since the 1920s to qualitatively identify hydrocarbons and is now a quantitative tool. A key figure was Gustavus Archie who in the 1940s established the field of petrophysics by relating well logs to core data. His work allowed properties like porosity, permeability and fluid saturation to be estimated. A presentation on well log interpretation outlined the workflow including editing logs, estimating properties like shale volume, porosity, permeability and fluid saturation, and presented two case studies analyzing different carbonate reservoirs.
Well Log Interpretation and Petrophysical Analisis in [Autosaved]Ridho Nanda Pratama
PT. Halliburton Logging Service is a branch of Halliburton that provides completion and production services, drilling, and reservoir evaluation to oil companies in Sumatra, Indonesia. Dery Marsan and Ridho Nanda Pratama completed an on-job training program at Halliburton from August to September 2015. Their project involved well log analysis to determine water saturation and the most suitable water resistivity parameters in two formations, with the objectives of identifying water zones, evaluating challenges around determining petrophysical parameters, and analyzing well data. Their analysis identified both water-bearing and possible oil-bearing zones through evaluation of gamma ray, resistivity, neutron-density crossplots, and other well logs.
This document provides an overview of well logging concepts and techniques. It discusses key well log formats and presentations, including standard tracks used for depth, resistivity, porosity, and other measurements. Common logs like SP, gamma ray, resistivity, neutron, and density are examined. Signatures indicating lithology, fluid content, and invasion are described. Proper interpretation requires understanding scales, crossovers caused by lithology changes, and matrix effects. An example of logs in a horizontal well is also provided.
This document provides an overview of induction logging techniques. It discusses the principles behind induction logging tools, including how they use transmitter and receiver coils to measure formation conductivity. It describes different coil configurations and focusing methods used to obtain measurements at various depths. The document also covers induction log corrections for effects like shoulder beds, borehole conditions, and skin effect. It provides an example induction log showing identification of thick and thin hydrocarbon zones.
The document provides information on reservoir mapping techniques and workflows. It discusses constructing structure maps, isopach maps, net pay maps, and fault maps to characterize reservoirs based on well log and seismic data. The maps are used for well placement, reserves calculations, and reservoir performance monitoring. Key steps include reservoir correlation, defining flow units, determining fluid contacts, and integrating geological and petrophysical data. The results provide insights into reservoir properties and geometry to promote optimal field development.
Brief review on Direct hydrocarbon indicators (DHI).
The presentation is a part from Seismic data interpretation course that i teach for undergraduates.
The sources are indicated in the references list.
Contact me via: hatem_refaat95@hotmail.com
The document discusses enhanced reservoir characterization using borehole images and dipmeter data. It begins with an overview of how logging tools have advanced from single measurements to detailed mapping of borehole walls using modern imaging tools with hundreds of thousands of data points per meter. The main topics covered include different types of dipmeter and imaging tools, generating borehole maps for orientation, stereographic projections for analyzing dip distributions, and processing raw data into geologically interpretable outputs like image and dip logs. Overall, the document outlines the transition from traditional well logging to digital geological mapping using high-resolution borehole wall data.
Presentation on how I realized my idea for a cloud service for organizing items, and how others can realize their own ideas using concepts and technologies on cloud computing.
This document appears to be notes from a presentation or workshop about learning styles and differentiation. It includes discussions of grouping students based on their preferred learning expressions, an activity where students create representations of differentiation using their least preferred style, and notes about various learning preferences and strategies.
This document provides an overview of starting an exclusive agency with Allstate. Key points:
- Allstate offers support like brand recognition, education programs, marketing materials and 24/7 claims service to help agents build their business.
- To get started, agents need available capital, property and casualty and life/health licenses, and NASD licenses. Qualities like success, self-motivation and entrepreneurial spirit are important.
- Agents can earn revenue through property and casualty commissions, Allstate financial commissions, bonuses for education, agency establishment and development based on tiers of performance levels.
- Allstate provides comprehensive education and support like branded retail environments, yellow pages advertising, executive advantage funds
1) The document discusses the opportunity for technology to improve organizational efficiency and transition economies into a "smart and clean world."
2) It argues that aggregate efficiency has stalled at around 22% for 30 years due to limitations of the Second Industrial Revolution, but that digitizing transport, energy, and communication through technologies like blockchain can help manage resources and increase efficiency.
3) Technologies like precision agriculture, cloud computing, robotics, and autonomous vehicles may allow for "dematerialization" and do more with fewer physical resources through effects like reduced waste and need for transportation/logistics infrastructure.
The document discusses radiometric corrections for remote sensing images. It describes how digital numbers are converted to top-of-atmosphere reflectance values using calibration coefficients and solar irradiance normalization. Atmospheric corrections are needed to estimate top-of-canopy reflectance and account for effects of gas absorption, scattering, and emission using a radiative transfer model like 6S. Parameters for the 6S model include viewing geometry, atmospheric properties, and spectral filter functions. Aerosol optical thickness can be obtained from Aeronet ground stations. Radiometric calibration is needed using reference reflectance panels.
2005-02-01 Co-Retrieval of Aerosol Color and Surface Color from SeaWiFS Satel...Rudolf Husar
The document discusses a method for simultaneously retrieving aerosol optical properties and aerosol-free surface reflectance from satellite images. SeaWiFS satellite data is used to co-retrieve the aerosol optical thickness and surface reflectance over land and ocean pixels. Iterative procedures are used to separate the effects of aerosols and surfaces on measured reflectance. Preliminary results over the northeastern US demonstrate the ability to map aerosol patterns and correct surface reflectance for aerosol effects.
2004-10-14 AIR-257: Satellite Detection of Aerosols Concepts and Theory - Rudolf Husar
The document summarizes a course on detecting aerosols using satellites. The course covers:
1) Introduction to satellite aerosol detection, different satellite types and their usage.
2) How satellites detect aerosol events like fires, dust storms and haze.
3) Using satellite data to study aerosols for projects like analyzing their fast response to changing environmental conditions.
4) Applications of satellite data in air quality management and issues around its use.
1. The document analyzes aerosol measurements from Higashi-Osaka, Japan to classify aerosol types into six categories and correlate aerosol optical thickness (AOT) with particulate matter (PM).
2. Aerosols were classified using k-means clustering of AERONET data into categories like dust, biomass burning, and pollution. Approximate size distributions were proposed to characterize each category.
3. Correlating AOT and PM measurements improved PM2.5 estimation from AOT by considering anthropogenic versus dust aerosols separately.
4. Aerosol retrieval algorithms were developed using the proposed aerosol models and properties to interpret MODIS data for heavy
Aerosol retrieval using MODIS data & RT code - Ahmad Mubin
This document summarizes information about aerosols including their sources, sizes, health effects, measurement techniques, and a methodology for satellite aerosol retrieval. It discusses using MODIS data at 500m resolution to estimate aerosol optical thickness over Hong Kong and compares the results to AERONET ground measurements. Key steps in the methodology include calculating top-of-atmosphere reflectance, accounting for Rayleigh scattering, surface reflectance, gas transmissions, and atmospheric effects to derive aerosol reflectance and optical thickness.
This document provides an overview of vibrational spectroscopy, specifically reflection absorption infrared spectroscopy (RAIRS). It discusses how RAIRS works by directing infrared radiation at a sample surface, analyzing the reflected beam to determine absorbed frequencies. RAIRS has excellent energy resolution and can study surface species and reactions under various conditions. It is most sensitive for observing adsorption of molecules with transition dipoles arranged along the surface normal. The document also covers instrumentation, theory, selection rules, examples of RAIRS analysis, and limitations.
This document summarizes airborne L-band radiometric measurements made by the CAROLS and SMOS instruments in the Gulf of Biscay in November 2010. CAROLS collected active microwave data from the C-band STORM scatterometer and passive microwave brightness temperature data from its L-band radiometer. SMOS also collected an L-band snapshot on November 25th. Analysis showed brightness temperatures depended on wind speed and scattered galactic noise. SMOS data was highly contaminated by radio frequency interference. The campaign provided an opportunity to validate SMOS sea surface salinity retrievals against in situ measurements and airborne data.
The document discusses the challenge of characterizing particulate matter using remote sensing data due to the complex and multidimensional nature of aerosols. It presents results from using SeaWiFS satellite data combined with surface observations to characterize aerosols over the US from 2000-2003. Specific cases studied include quantifying smoke emissions from agricultural fires in Kansas in 2003 by analyzing the optical thickness and shape of smoke plumes. Summer climatologies of aerosol optical thickness over the US from 2000-2004 are also shown.
2004-10-03 Co-retrieval of Aerosol and Surface Reflectance: Analysis of Daily... - Rudolf Husar
The document discusses a method for co-retrieving aerosol and surface reflectance from daily SeaWiFS satellite data from 2000-2002. It describes preprocessing the raw satellite data, identifying preliminary clear days using a time series analysis approach, and further refining the surface reflectance estimates and removing residual haze through spectral analysis. Results show seasonal surface reflectance maps for the eastern and western US and an 8-month animation of the retrieved data.
Geophysical methods such as well logs and seismic studies are used to correlate and map rock layers where there is no surface exposure. Well logs record information from probes in boreholes, measuring properties like density, permeability, and pore fluid content. Seismic studies involve generating sound waves that reflect off subsurface interfaces, allowing approximation of rock layer geometry. These remote techniques provide data to interpret stratigraphy where direct observation is not possible.
Quantitative and Qualitative Seismic Interpretation of Seismic Data - Haseeb Ahmed
This document discusses quantitative and qualitative seismic interpretation techniques used to analyze seismic data and map subsurface geology. It compares traditional qualitative techniques to more modern quantitative techniques. It then focuses on unconventional seismic interpretation techniques used for unconventional reservoirs with low permeability, including AVO analysis, seismic inversion, seismic attributes, and forward seismic modeling. These techniques can help identify tight gas, shale gas, and gas hydrate reservoirs that conventional methods cannot easily detect. The document provides details on how each technique works and its advantages.
Atmospheric window and reflectance curve - marutiChilame
The document discusses atmospheric windows and spectral reflectance curves. It states that the atmosphere selectively transmits certain wavelengths, known as atmospheric windows, which are present in the visible and infrared regions. It also explains that different materials reflect and absorb light differently at varying wavelengths, shown via spectral reflectance curves. These curves plot wavelength against reflectance and vary for different materials like vegetation, soil, and water.
2005-12-05 Aerosol Characterization Using the SeaWiFS Sensor and Surface Data - Rudolf Husar
This document discusses the challenge of characterizing particulate matter using satellite and surface data due to the complex six-dimensional nature of aerosols. It presents an approach using SeaWiFS satellite data along with surface observations to derive patterns of dust, smoke and haze over the United States from 2000 to 2003. Specific examples are given of using this method to estimate smoke emissions from agricultural fires in Kansas in April 2003 and analyze seasonal and regional trends in aerosol optical thickness from SeaWiFS data over summers from 2000 to 2004.
2004-06-24 Co-retrieval of Aerosol and Surface Reflectance: Analysis of Daily... - Rudolf Husar
The document summarizes a method for co-retrieving aerosol and surface reflectance from daily SeaWiFS satellite data from 2000-2002. It describes how aerosols scatter and absorb incoming radiation, obscuring the surface reflectance detected by the sensor. The method uses a time series analysis to identify clear "anchor" days with minimal aerosol scattering to retrieve the surface reflectance. It then uses a radiative transfer model along with the surface reflectance values to iteratively retrieve the aerosol optical thickness and refine the surface reflectance estimates. Results show seasonal changes in surface reflectance over eastern and western US regions.
This document describes a study that used satellite measurements from the Cross-track Infrared Sounder (CrIS) onboard the Suomi NPP satellite to directly measure atmospheric isoprene on a global scale. The authors developed an algorithm to retrieve isoprene column abundances from CrIS spectral measurements. They applied this algorithm over the Amazon region, a major isoprene source, and found the results were consistent with model predictions and in-situ measurements, demonstrating the feasibility of direct global satellite measurements of isoprene. Combining these measurements with formaldehyde observations could help constrain atmospheric oxidation over isoprene source regions.
Atmospheric aerosols are particles in the air that can affect climate in various ways. They can cool the climate by reflecting sunlight, but also impact clouds and precipitation. Aerosols have likely offset some warming from greenhouse gases in the past, but exactly how much is unclear. The presenter studies aerosols using climate models to better understand their effects on climate and how their future reduction may influence additional warming from rising carbon dioxide levels.
Similar to 1148_DAILY_ESTIMATES_OF_THE_TROPOSPHERIC_AEROSOL_OPTICAL_THICKNESS_OVER_LAND_SURFACE_FROM_MSG_GEOSTRATIONARY_OBSERVATION.pdf (20)
SEGMENTATION OF POLARIMETRIC SAR DATA WITH A MULTI-TEXTURE PRODUCT MODEL - grssieee
1) The document describes a segmentation algorithm for polarimetric SAR (PolSAR) data that can model both scalar-texture and multi-texture scattering.
2) The algorithm uses log-cumulants and hypothesis testing to determine whether a scalar-texture or dual-texture model best fits the data within each segment.
3) The algorithm is tested on simulated multi-texture PolSAR data and is shown to accurately segment the classes and estimate their texture parameters. However, when applied to real data sets, the algorithm only finds the simpler scalar-texture case.
TWO-POINT STATISTIC OF POLARIMETRIC SAR DATA - grssieee
This document discusses using wavelet transforms to analyze two-point statistics of polarimetric synthetic aperture radar (PolSAR) data. It introduces wavelet variance and kurtosis as metrics that can be applied to PolSAR data transformed using a wavelet frame. It then provides an example of applying this analysis to ALOS PALSAR data over Hawaii's Papau Seamount to characterize sea surface features.
THE SENTINEL-1 MISSION AND ITS APPLICATION CAPABILITIES - grssieee
The Sentinel-1 mission is part of the GMES program and consists of two satellites to provide C-band SAR data for emergency response, marine and land monitoring, and other applications. The satellites operate in a near-polar orbit with a 12 day repeat cycle. The main acquisition mode is an interferometric wide swath mode with 5m range and 20m azimuth resolution over a 250km swath. Sentinel-1 will support operational services and create a long-term SAR data archive.
The document summarizes the status of the GMES Space Component program. It describes the Sentinel satellite missions for monitoring land, ocean, atmosphere and emergency situations. The Sentinels will provide long-term data continuity as well as improved coverage compared to existing missions. Sentinel data will be freely and openly available to both operational users and the science community. The program is on track, with the first Sentinel launches beginning in 2013.
PROGRESSES OF DEVELOPMENT OF CFOSAT SCATTEROMETER - grssieee
The document describes the progress of the development of CFOSAT SCAT, a Ku-band scatterometer onboard the Chinese-French Oceanography Satellite (CFOSAT). CFOSAT will measure global ocean surface winds and waves to improve weather forecasting, ocean dynamics modeling, climate research, and understanding of surface processes. The SCAT instrument is a rotating fan-beam radar scatterometer that will retrieve wind vectors using measurements of backscatter at incidence angles from 26 to 46 degrees. It has a wide swath of over 1000km and specifications are designed to achieve high-precision wind measurements globally. System details including parameters and the operation mode are provided.
DEVELOPMENT OF ALGORITHMS AND PRODUCTS FOR SUPPORTING THE ITALIAN HYPERSPECTR... - grssieee
The document describes the SAP4PRISMA project which aims to develop algorithms and products to support the Italian hyperspectral PRISMA Earth observation mission. The project will focus on data processing, quality assessment, classification methods, and generating level 3 and 4 products for applications like land monitoring, agriculture, and hazard monitoring. It will include the generation of "PRISMA-like" synthetic test data to support algorithm development and validation. The research will be carried out across multiple work packages focusing on topics like data quality, classification methods, calibration/validation, and developing applicative products.
EO-1/HYPERION: NEARING TWELVE YEARS OF SUCCESSFUL MISSION SCIENCE OPERATION A... - grssieee
1) The EO-1 Hyperion instrument has collected over 65,000 scenes over its 12-year mission to study land and coastal ecosystems using imaging spectroscopy.
2) Studies using Hyperion data have identified spectral indices related to chlorophyll that correlate with carbon flux measurements at different sites, including a Zambian woodland and North Carolina forest sites.
3) Time series of Hyperion data at flux tower sites show seasonal changes in these spectral indices that match patterns in ecosystem carbon uptake and release.
EO-1/HYPERION: NEARING TWELVE YEARS OF SUCCESSFUL MISSION SCIENCE OPERATION A... - grssieee
1) The EO-1 Hyperion instrument has collected over 65,000 scenes over its 12-year mission to study land and coastal ecosystems using imaging spectroscopy.
2) Studies using Hyperion data have identified spectral indices related to chlorophyll that correlate with carbon flux measurements at different forest, grassland, and woodland sites globally.
3) Time series of Hyperion data at sites in Zambia, North Carolina, and Kansas show seasonal changes in these spectral indices that match patterns in ecosystem carbon uptake and release measured by flux towers.
EO-1/HYPERION: NEARING TWELVE YEARS OF SUCCESSFUL MISSION SCIENCE OPERATION A... - grssieee
EO-1/Hyperion has been collecting hyperspectral imagery for over 12 years, acquiring over 65,000 scenes. Researchers have been using these data to develop and validate algorithms for estimating vegetation properties like fraction of absorbed photosynthetically active radiation (fAPAR) and photochemical reflectance index (PRI). Comparisons of Hyperion data to field measurements at flux tower sites show these algorithms can accurately track vegetation changes over time and relate spectral properties to productivity metrics like light use efficiency and gross ecosystem productivity. This work is helping prototype data products for the upcoming HyspIRI mission.
This document is a return and exchange form for a wetsuit company. It provides instructions for customers to fill out when returning an undamaged item for a refund, exchange, or size change. The form requests information like the customer's order details, contact information, the suit being returned and its size, the reason for return, and if applicable, the new desired size. It also provides the return shipping address and notifies customers that the company is not responsible for lost or damaged return packages.
This document provides instructions for clients of Fox Tax Planning and Preparation for preparing to have their taxes filed. It lists important income and deduction documentation to bring to an appointment, such as W-2s, 1099s, receipts for donations. It also includes an engagement letter detailing the services to be provided, responsibilities of both parties, fees, and electronic filing and signature procedures. Clients are asked to sign the letter agreeing to the terms and return it along with their tax information.
The document discusses mapping wetlands in North America using MODIS 500m imagery. It describes wetlands and existing global wetland databases. The methodology uses MODIS data from 2008, digital elevation models, and reference data to classify wetlands into three types - forest/shrub dominant wetlands, herbaceous dominant wetlands, and sea grass dominant wetlands. Training data is collected from existing land cover maps and Landsat imagery. A decision tree model and maximum likelihood classification are applied to extract wetlands from other land covers.
The document summarizes research using SBAS-DInSAR (Small BAseline Subset differential interferometric synthetic aperture radar) techniques to analyze ground deformation at Mt. Etna volcano in Italy over the last 18 years using ERS and ENVISAT satellite data. The analysis revealed three main deformation processes: inflation of the volcanic edifice, subsidence of sectors on the eastern flank due to gravitational spreading, and deflation-inflation cycles associated with eruptive and post-eruptive activity. More recent analysis using higher resolution COSMO-SkyMed data from 2009-2010 detected deformation related to faults and a 2010 earthquake more precisely than lower resolution ENVISAT data.
LF Energy Webinar: Carbon Data Specifications: Mechanisms to Improve Data Acc... - DanBrown980551
This LF Energy webinar took place June 20, 2024. It featured:
-Alex Thornton, LF Energy
-Hallie Cramer, Google
-Daniel Roesler, UtilityAPI
-Henry Richardson, WattTime
In response to the urgency and scale required to effectively address climate change, open source solutions offer significant potential for driving innovation and progress. Currently, there is a growing demand for standardization and interoperability in energy data and modeling. Open source standards and specifications within the energy sector can also alleviate challenges associated with data fragmentation, transparency, and accessibility. At the same time, it is crucial to consider privacy and security concerns throughout the development of open source platforms.
The webinar delved into the motivations behind establishing LF Energy’s Carbon Data Specification Consortium. It provided an overview of the draft specifications and the ongoing progress made by the respective working groups.
Three primary specifications will be discussed:
-Discovery and client registration, emphasizing transparent processes and secure and private access
-Customer data, centering around customer tariffs, bills, energy usage, and full consumption disclosure
-Power systems data, focusing on grid data, inclusive of transmission and distribution networks, generation, intergrid power flows, and market settlement data
Introduction of Cybersecurity with OSS at Code Europe 2024 - Hiroshi SHIBATA
I develop the Ruby programming language, RubyGems, and Bundler, which are package managers for Ruby. Today, I will introduce how to enhance the security of your application using open-source software (OSS) examples from Ruby and RubyGems.
The first topic is CVE (Common Vulnerabilities and Exposures). I have published CVEs many times. But what exactly is a CVE? I'll provide a basic understanding of CVEs and explain how to detect and handle vulnerabilities in OSS.
Next, let's discuss package managers. Package managers play a critical role in the OSS ecosystem. I'll explain how to manage library dependencies in your application.
I'll share insights into how the Ruby and RubyGems core team works to keep our ecosystem safe. By the end of this talk, you'll have a better understanding of how to safeguard your code.
In the realm of cybersecurity, offensive security practices act as a critical shield. By simulating real-world attacks in a controlled environment, these techniques expose vulnerabilities before malicious actors can exploit them. This proactive approach allows manufacturers to identify and fix weaknesses, significantly enhancing system security.
This presentation delves into the development of a system designed to mimic Galileo's Open Service signal using software-defined radio (SDR) technology. We'll begin with a foundational overview of both Global Navigation Satellite Systems (GNSS) and the intricacies of digital signal processing.
The presentation culminates in a live demonstration. We'll showcase the manipulation of Galileo's Open Service pilot signal, simulating an attack on various software and hardware systems. This practical demonstration serves to highlight the potential consequences of unaddressed vulnerabilities, emphasizing the importance of offensive security practices in safeguarding critical infrastructure.
Taking AI to the Next Level in Manufacturing.pdf - ssuserfac0301
Read Taking AI to the Next Level in Manufacturing to gain insights on AI adoption in the manufacturing industry, such as:
1. How quickly AI is being implemented in manufacturing.
2. Which barriers stand in the way of AI adoption.
3. How data quality and governance form the backbone of AI.
4. Organizational processes and structures that may inhibit effective AI adoption.
6. Ideas and approaches to help build your organization's AI strategy.
"Choosing proper type of scaling", Olena Syrota - Fwdays
Imagine an IoT processing system that is already quite mature and production-ready and for which client coverage is growing and scaling and performance aspects are life and death questions. The system has Redis, MongoDB, and stream processing based on ksqldb. In this talk, firstly, we will analyze scaling approaches and then select the proper ones for our system.
Conversational agents, or chatbots, are increasingly used to access all sorts of services using natural language. While open-domain chatbots - like ChatGPT - can converse on any topic, task-oriented chatbots - the focus of this paper - are designed for specific tasks, like booking a flight, obtaining customer support, or setting an appointment. Like any other software, task-oriented chatbots need to be properly tested, usually by defining and executing test scenarios (i.e., sequences of user-chatbot interactions). However, there is currently a lack of methods to quantify the completeness and strength of such test scenarios, which can lead to low-quality tests, and hence to buggy chatbots.
To fill this gap, we propose adapting mutation testing (MuT) for task-oriented chatbots. To this end, we introduce a set of mutation operators that emulate faults in chatbot designs, an architecture that enables MuT on chatbots built using heterogeneous technologies, and a practical realisation as an Eclipse plugin. Moreover, we evaluate the applicability, effectiveness and efficiency of our approach on open-source chatbots, with promising results.
"Frontline Battles with DDoS: Best practices and Lessons Learned", Igor Ivaniuk - Fwdays
At this talk we will discuss DDoS protection tools and best practices, discuss network architectures and what AWS has to offer. Also, we will look into one of the largest DDoS attacks on Ukrainian infrastructure that happened in February 2022. We'll see, what techniques helped to keep the web resources available for Ukrainians and how AWS improved DDoS protection for all customers based on Ukraine experience
zkStudyClub - LatticeFold: A Lattice-based Folding Scheme and its Application... - Alex Pruden
Folding is a recent technique for building efficient recursive SNARKs. Several elegant folding protocols have been proposed, such as Nova, Supernova, Hypernova, Protostar, and others. However, all of them rely on an additively homomorphic commitment scheme based on discrete log, and are therefore not post-quantum secure. In this work we present LatticeFold, the first lattice-based folding protocol based on the Module SIS problem. This folding protocol naturally leads to an efficient recursive lattice-based SNARK and an efficient PCD scheme. LatticeFold supports folding low-degree relations, such as R1CS, as well as high-degree relations, such as CCS. The key challenge is to construct a secure folding protocol that works with the Ajtai commitment scheme. The difficulty is ensuring that extracted witnesses are low norm through many rounds of folding. We present a novel technique using the sumcheck protocol to ensure that extracted witnesses are always low norm no matter how many rounds of folding are used. Our evaluation of the final proof system suggests that it is as performant as Hypernova, while providing post-quantum security.
Paper Link: https://eprint.iacr.org/2024/257
Connector Corner: Seamlessly power UiPath Apps, GenAI with prebuilt connectors - DianaGray10
Join us to learn how UiPath Apps can directly and easily interact with prebuilt connectors via Integration Service--including Salesforce, ServiceNow, Open GenAI, and more.
The best part is you can achieve this without building a custom workflow! Say goodbye to the hassle of using separate automations to call APIs. By seamlessly integrating within App Studio, you can now easily streamline your workflow, while gaining direct access to our Connector Catalog of popular applications.
We’ll discuss and demo the benefits of UiPath Apps and connectors including:
Creating a compelling user experience for any software, without the limitations of APIs.
Accelerating the app creation process, saving time and effort
Enjoying high-performance CRUD (create, read, update, delete) operations, for
seamless data management.
Speakers:
Russell Alfeche, Technology Leader, RPA at qBotic and UiPath MVP
Charlie Greenberg, host
Your One-Stop Shop for Python Success: Top 10 US Python Development Providers - akankshawande
Simplify your search for a reliable Python development partner! This list presents the top 10 trusted US providers offering comprehensive Python development services, ensuring your project's success from conception to completion.
Have you ever been confused by the myriad of choices offered by AWS for hosting a website or an API?
Lambda, Elastic Beanstalk, Lightsail, Amplify, S3 (and more!) can each host websites + APIs. But which one should we choose?
Which one is cheapest? Which one is fastest? Which one will scale to meet our needs?
Join me in this session as we dive into each AWS hosting service to determine which one is best for your scenario and explain why!
How information systems are built or acquired puts information, which is what they should be about, in a secondary place. Our language adapted accordingly, and we no longer talk about information systems but applications. Applications evolved in a way to break data into diverse fragments, tightly coupled with applications and expensive to integrate. The result is technical debt, which is re-paid by taking even bigger "loans", resulting in an ever-increasing technical debt. Software engineering and procurement practices work in sync with market forces to maintain this trend. This talk demonstrates how natural this situation is. The question is: can something be done to reverse the trend?
Fueling AI with Great Data with Airbyte Webinar - Zilliz
This talk will focus on how to collect data from a variety of sources, leveraging this data for RAG and other GenAI use cases, and finally charting your course to productionalization.
AppSec PNW: Android and iOS Application Security with MobSF - Ajin Abraham
Mobile Security Framework - MobSF is a free and open source automated mobile application security testing environment designed to help security engineers, researchers, developers, and penetration testers to identify security vulnerabilities, malicious behaviours and privacy concerns in mobile applications using static and dynamic analysis. It supports all the popular mobile application binaries and source code formats built for Android and iOS devices. In addition to automated security assessment, it also offers an interactive testing environment to build and execute scenario based test/fuzz cases against the application.
This talk covers:
Using MobSF for static analysis of mobile applications.
Interactive dynamic security assessment of Android and iOS applications.
Solving Mobile app CTF challenges.
Reverse engineering and runtime analysis of Mobile malware.
How to shift left and integrate MobSF/mobsfscan SAST and DAST in your build pipeline.
For the full video of this presentation, please visit: https://www.edge-ai-vision.com/2024/06/temporal-event-neural-networks-a-more-efficient-alternative-to-the-transformer-a-presentation-from-brainchip/
Chris Jones, Director of Product Management at BrainChip , presents the “Temporal Event Neural Networks: A More Efficient Alternative to the Transformer” tutorial at the May 2024 Embedded Vision Summit.
The expansion of AI services necessitates enhanced computational capabilities on edge devices. Temporal Event Neural Networks (TENNs), developed by BrainChip, represent a novel and highly efficient state-space network. TENNs demonstrate exceptional proficiency in handling multi-dimensional streaming data, facilitating advancements in object detection, action recognition, speech enhancement and language model/sequence generation. Through the utilization of polynomial-based continuous convolutions, TENNs streamline models, expedite training processes and significantly diminish memory requirements, achieving notable reductions of up to 50x in parameters and 5,000x in energy consumption compared to prevailing methodologies like transformers.
Integration with BrainChip’s Akida neuromorphic hardware IP further enhances TENNs’ capabilities, enabling the realization of highly capable, portable and passively cooled edge devices. This presentation delves into the technical innovations underlying TENNs, presents real-world benchmarks, and elucidates how this cutting-edge approach is positioned to revolutionize edge AI across diverse applications.
Northern Engraving | Modern Metal Trim, Nameplates and Appliance Panels - Northern Engraving
What began over 115 years ago as a supplier of precision gauges to the automotive industry has evolved into being an industry leader in the manufacture of product branding, automotive cockpit trim and decorative appliance trim. Value-added services include in-house Design, Engineering, Program Management, Test Lab and Tool Shops.
What is an RPA CoE? Session 1 – CoE Vision - DianaGray10
In the first session, we will review the organization's vision and how this has an impact on the COE Structure.
Topics covered:
• The role of a steering committee
• How do the organization’s priorities determine CoE Structure?
Speaker:
Chris Bolin, Senior Intelligent Automation Architect Anika Systems
1. Aerosol Optical Depth based on a temporal and directional analysis of SEVIRI observations
Dominique Carrer, Olivier Hautecoeur, and Jean-Louis Roujean
CNRM-GAME
Météo-France / CNRS
Toulouse, France
2. Introduction
Determination of the aerosol load is at the core of many applications: epidemiologic risk, food security, air quality, health, weather forecasting, climate change detection, and the hydrological cycle.
Aerosols essentially originate from human activities, dust storms, biomass burning, vegetation, the sea, and volcanoes, and also from the gas-to-particle conversion mechanism.
Aerosols: fine solid particles or liquid droplets in suspension in the atmosphere
– Sea salt (SS), dust (DU), sulphate (SU), particulate organic matter (OM), black carbon (BC)
➢ A mixture of aerosol classes from different emission sources is generally observed, and aerosols interact rapidly with trace gases and water.
The type and amount of aerosols in the atmosphere vary greatly from day to day and from place to place.
IGARSS Conference 2011, Vancouver, Canada
3. Principles and methodology
The main difficulty of aerosol detection is separating the contributions to the measured signal arising from atmospheric scattering and from surface reflectance.
Quantitative assessment of the aerosol load from a retrieval of Aerosol Optical Depth
Optimum exploitation of the 4 dimensions of the signal to characterize aerosols:
– Spatial (contrast reduction, aerosol layer more homogeneous than clouds)
– Spectral (Angström coefficient → aerosol type)
– Temporal (aerosol components evolve more quickly than surface components)
– Directional (aerosols and surface exhibit different angular signature)
➔ Proposed method
➢ Separates aerosol signal from the surface (vegetation, desert, snow) under clear-sky conditions
➢ Simultaneous inversion of surface and aerosol properties
4. Principles and methodology
Daily collection of « apparent » surface reflectance describes the directionality of the
ground surface reflectance
– Since aerosol and surface reflectance have different directional behaviour and different
temporal evolution, it is possible to discriminate the aerosol signal from the signal measured by
satellite.
Joint retrieval of aerosol optical thickness and surface bidirectional reflectance
distribution function (BRDF)
– Derived from the operational surface albedo processing chain
– Daily estimate of AOT over land
– No spectral information is exploited; only the VIS06 channel is used
– No a priori information on aerosol load nor on aerosol type
5. Principles and methodology
Top-of-atmosphere (TOA) reflectance is decomposed into gaseous absorption, molecular scattering, and the top-of-layer (ToL) reflectance; the ToL reflectance itself is the sum of aerosol scattering and surface reflectance.
Scattering and absorption properties of the atmosphere are treated separately for aerosols and molecules:
– Removal of gas absorption and Rayleigh scattering yields the "apparent" (ToL) reflectance
– Joint retrieval of AOD and surface BRDF
➢ Coupling between molecular/H2O absorption and aerosol scattering is neglected
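The decomposition above can be sketched numerically. A minimal illustration assuming scalar, single-band values; all numbers (gas transmittance, Rayleigh reflectance, component reflectances) are made up for the example and are not SEVIRI calibration values:

```python
# Minimal sketch of the TOA reflectance decomposition described above.
# All numeric values are illustrative only.

def tol_reflectance(rho_aerosol, rho_surface_coupled):
    """ToL reflectance = aerosol scattering + surface term."""
    return rho_aerosol + rho_surface_coupled

def toa_reflectance(t_gas, rho_rayleigh, rho_tol):
    """TOA reflectance = gaseous absorption applied to molecular
    (Rayleigh) scattering plus the top-of-layer reflectance."""
    return t_gas * (rho_rayleigh + rho_tol)

rho_tol = tol_reflectance(rho_aerosol=0.05, rho_surface_coupled=0.20)
rho_toa = toa_reflectance(t_gas=0.95, rho_rayleigh=0.03, rho_tol=rho_tol)
print(round(rho_toa, 4))  # 0.95 * (0.03 + 0.25) = 0.266
```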
6. Principles and methodology
Top-of-layer reflectance: classical radiative transfer equation for a single scattering (aerosol) layer, with the surface reflectance as a boundary condition [Lenoble, 1985]:

ρ_ToL(θ_s, θ_v, φ, τ) = ρ_a(θ_s, θ_v, φ, τ) + T↓(θ_s, τ) T↑(θ_v, τ) ρ_s(θ_s, θ_v, φ) / (1 − S_a ρ_s)

where ρ_a is the aerosol reflectance, T↓ and T↑ the downward and upward transmissions, S_a the spherical albedo of the aerosol layer, ρ_s the surface reflectance, and τ the aerosol optical thickness (AOT).
7. Model parametrization
Method:
– discriminate the directional signatures of the surface and aerosols by isolating, at high solar angles, the higher sensitivity to atmospheric properties
– use a Kalman filter with different characteristic time scales for land and atmospheric variations

ρ_ToL(θ_s, θ_v, φ) = [1 / (1 − S ρ_e)] T↓(θ_s; τ) T↑(θ_v; τ) ρ_s(θ_s, θ_v, φ) + ρ_aer(θ_s, θ_v, φ; τ)

Surface BRDF (Roujean et al., 1992), a sum of isotropic, geometric and volumetric kernels:

ρ_s(θ_s, θ_v, φ) = Σ_{i=0..2} k_i f_i(θ_s, θ_v, φ)
f_0(θ_s, θ_v, φ) = 1
f_1(θ_s, θ_v, φ) = (1/2π) [(π − φ) cos φ + sin φ] tan θ_s tan θ_v − (1/π) [tan θ_s + tan θ_v + √(tan²θ_s + tan²θ_v − 2 tan θ_s tan θ_v cos φ)]
f_2(θ_s, θ_v, φ) = (4/3π) [1 / (μ_s + μ_v)] [(π/2 − ξ) cos ξ + sin ξ] − 1/3

with ξ the scattering angle and μ = cos θ.

Aerosol reflectance:

ρ_aer(θ_s, θ_v, φ; τ) = [ω / (4 μ_s μ_v η)] [P(ξ) + H(μ_s) H(μ_v) − 1] [1 − e^(−ητ)]

with ω the single-scattering albedo, P the phase function and H the Chandrasekhar H-functions.
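The Roujean et al. (1992) kernels can be written directly in code. A minimal sketch assuming angles in radians; the function names and the kernel weights k_i are illustrative, not the operational implementation:

```python
import math

def f1(theta_s, theta_v, phi):
    """Geometric kernel f_1 (Roujean et al., 1992); angles in radians."""
    ts, tv = math.tan(theta_s), math.tan(theta_v)
    delta = math.sqrt(ts ** 2 + tv ** 2 - 2.0 * ts * tv * math.cos(phi))
    return ((1.0 / (2.0 * math.pi))
            * ((math.pi - phi) * math.cos(phi) + math.sin(phi)) * ts * tv
            - (1.0 / math.pi) * (ts + tv + delta))

def f2(theta_s, theta_v, phi):
    """Volumetric kernel f_2; xi is the scattering phase angle."""
    mu_s, mu_v = math.cos(theta_s), math.cos(theta_v)
    cos_xi = mu_s * mu_v + math.sin(theta_s) * math.sin(theta_v) * math.cos(phi)
    xi = math.acos(max(-1.0, min(1.0, cos_xi)))
    return ((4.0 / (3.0 * math.pi)) / (mu_s + mu_v)
            * ((math.pi / 2.0 - xi) * math.cos(xi) + math.sin(xi)) - 1.0 / 3.0)

def surface_brdf(k, theta_s, theta_v, phi):
    """rho_s = k0 * f0 + k1 * f1 + k2 * f2, with f0 = 1."""
    return k[0] + k[1] * f1(theta_s, theta_v, phi) + k[2] * f2(theta_s, theta_v, phi)

# At nadir sun and view both kernels vanish, so rho_s reduces to k0.
print(surface_brdf([0.2, 0.02, 0.1],
                   math.radians(30), math.radians(10), math.radians(120)))
```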
8. Model parametrization
Aerosol and surface reflectance form a single BRDF model, decomposed into a series of angular kernels representing elementary photometric processes:

ρ_ToL(θ_s, θ_v, φ, τ) = Σ_{i=0..3} k_i f'_i(θ_s, θ_v, φ, τ)

➢ Pseudo-linear theory (the surface/aerosol coupling is non-linear)
➢ All components are analytical (the model is differentiable)

Surface contribution:
f'_{i=0..2}(θ_s, θ_v, φ, τ) = [T↓(θ_s, τ) T↑(θ_v, τ) / (1 − S_a(τ) ⟨ρ⟩)] f_i(θ_s, θ_v, φ)
with the transmissions T(θ, τ) parameterized through u, v, w, which depend on τ and g (Rozanov and Kokhanovsky, 2006), and the spherical albedo S_a(τ) = a e^(−τ/α) + b e^(−τ/β) + c, where a, b, c, α, β are constants parameterized by the asymmetry factor g (Kokhanovsky et al., 2005).

Direct aerosol contribution:
f'_3(θ_s, θ_v, φ, τ) = [ω P(ξ) / (4 μ_s μ_v)] [(1 − e^(−mτ)) / m] f_ms(τ)
with m the air-mass factor and f_ms(τ) an analytical multiple-scattering correction factor.

The surface kernels f_0, f_1, f_2 are those of Roujean et al. (1992) given on the previous slide.
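The direct aerosol contribution can be sketched under stated assumptions: a Henyey-Greenstein phase function standing in for P(ξ), f_ms set to 1, and the air-mass factor m taken as 1/μ_s + 1/μ_v. These are illustrative choices, not the operational parameterization:

```python
import math

def hg_phase(cos_xi, g):
    """Henyey-Greenstein phase function (see the approximations slide)."""
    return (1.0 - g ** 2) / (1.0 + g ** 2 - 2.0 * g * cos_xi) ** 1.5

def aerosol_kernel(tau, mu_s, mu_v, cos_xi, omega=0.9, g=0.6):
    """f'_3 ~ omega * P(xi) / (4 mu_s mu_v) * (1 - exp(-m tau)) / m,
    with the multiple-scattering factor f_ms approximated by 1."""
    m = 1.0 / mu_s + 1.0 / mu_v  # assumed air-mass factor (illustrative)
    return (omega * hg_phase(cos_xi, g) / (4.0 * mu_s * mu_v)
            * (1.0 - math.exp(-m * tau)) / m)

# The kernel vanishes for a clean atmosphere (tau = 0) and grows with AOT.
print(aerosol_kernel(0.0, 0.8, 0.9, -0.5))  # 0.0
print(aerosol_kernel(0.5, 0.8, 0.9, -0.5))  # > 0
```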
9. Mathematical design
Kalman filter approach:

Z = F K

Z = [ρ_ToL(θ_s^1, θ_v^1, φ^1), …, ρ_ToL(θ_s^N, θ_v^N, φ^N)]   vector of N observations
K = [k_0, k_1, k_2, k_3]   vector of parameters
F = [f'_0, f'_1, f'_2, f'_3]   matrix of angular kernel functions

K = C_k (Aᵀ B + C_ap⁻¹ K_ap + C_reg⁻¹ K_reg)
C_k⁻¹ = Aᵀ A + C_ap⁻¹ + C_reg⁻¹   (covariance matrix)

A, B: matrices for F and Z, scaled (normalized) by the standard error σ(ρ_ToL)

Our semi-physical approach aims to derive an algorithm that performs efficiently:
– Ill-conditioning is avoided using the regularization terms K_reg and C_reg
– Persistence is obtained using the prior information K_ap and C_ap
– The state variable K is estimated using a recursive procedure
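The regularized estimate of K can be sketched with NumPy. The synthetic observations, true parameters and diagonal covariances below are made up for the example; this shows only the single-slot least-squares step, not the operational recursive implementation:

```python
import numpy as np

def estimate_state(A, B, K_ap, C_ap, K_reg, C_reg):
    """K = C_k (A^T B + C_ap^-1 K_ap + C_reg^-1 K_reg),
    with C_k^-1 = A^T A + C_ap^-1 + C_reg^-1."""
    C_ap_inv = np.linalg.inv(C_ap)
    C_reg_inv = np.linalg.inv(C_reg)
    C_k_inv = A.T @ A + C_ap_inv + C_reg_inv
    rhs = A.T @ B + C_ap_inv @ K_ap + C_reg_inv @ K_reg
    return np.linalg.solve(C_k_inv, rhs)

rng = np.random.default_rng(0)
K_true = np.array([0.2, 0.02, 0.1, 0.3])      # k0..k3 (illustrative)
A = rng.normal(size=(20, 4))                  # scaled kernel matrix F
B = A @ K_true + 0.01 * rng.normal(size=20)   # scaled observations Z
K_ap = np.zeros(4)                            # prior state
C_ap = 10.0 * np.eye(4)                       # loose prior covariance
K_reg = np.zeros(4)
C_reg = 100.0 * np.eye(4)                     # weak regularization
K_hat = estimate_state(A, B, K_ap, C_ap, K_reg, C_reg)
print(K_hat)  # close to K_true
```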
10. Two-step process
Step 1 – Atmospheric characterisation: SEVIRI TOA radiances are cloud-screened with the SAF-NWC CMa cloud mask and partially corrected for atmospheric effects using a DEM, a land-sea mask (LSM) and ECMWF forecasts, yielding screened ToL radiances.
Step 2 – Inversion process: aerosol/surface unmixing produces the surface reflectance and the aerosol product.
All clear data are used at full resolution.
11. Validation against AERONET data sets
Daily MSG AOT values are compared to AERONET ground measurements.
Location of the AERONET stations investigated in the present study.
13. Validation with AERONET stations in Europe
Scatter plots of SEVIRI versus AERONET AOD (some outliers possibly due to false cloud detection):
– bias = −0.026, stdev = 0.104, R = 0.54
– bias = −0.027, stdev = 0.112, R = 0.56
– Daily AOD: bias = −0.022, stdev = 0.089, R = 0.69
14. Validation with AERONET stations in Africa
Scatter plots of SEVIRI versus AERONET AOD:
– bias = −0.028, stdev = 0.092, R = 0.83
– bias = −0.011, stdev = 0.233, R = 0.90
– Daily AOD: bias = −0.122, stdev = 0.277, R = 0.75
15. Monitoring an aerosol event
AOD estimated for the SEVIRI visible band, with AOD from the MODIS product superimposed over ocean (0.5°).
Good consistency is noticed for AOD up to 3 and beyond.
16. Monitoring an aerosol event
SEVIRI AOD (black) and AERONET AOD (green) over 6 Western African sites, March 1–21, 2006.
18. AOT vs density of urbanization
Mean AOT from Monday 29 May 2006 to Sunday 2 July 2006 (5 complete weeks) versus day of the week and town density in a region including Europe and North Africa.
Three categories were established using the GLC2000 land cover classification: MSG/SEVIRI pixels containing less than 30%, between 30% and 90%, and more than 90% of the class 'artificial surfaces'.
19. Method Approximations
Mie phase function (colour) for representative aerosol types; Henyey-Greenstein (black) for g = 0.3, g = 0.6 (solid) and g = 0.75 (dashed).
Some aerosol types are particularly sensitive to particle size (DU, SS) while others (OM, SU) have characteristics that depend on relative humidity.
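The Henyey-Greenstein approximation can be checked quickly. A sketch verifying that, in the convention where (1/2)∫ P dμ = 1, the phase function is properly normalized for the asymmetry values of the figure:

```python
import math

def hg_phase(cos_xi, g):
    """Henyey-Greenstein phase function with asymmetry factor g."""
    return (1.0 - g ** 2) / (1.0 + g ** 2 - 2.0 * g * cos_xi) ** 1.5

def check_normalization(g, n=20000):
    """Midpoint-rule value of (1/2) * integral of P over cos(xi) in [-1, 1]."""
    step = 2.0 / n
    total = sum(hg_phase(-1.0 + (i + 0.5) * step, g) * step for i in range(n))
    return 0.5 * total

for g in (0.3, 0.6, 0.75):  # asymmetry values of the figure
    print(g, round(check_normalization(g), 4))  # each close to 1.0
```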
20. SEVIRI angular sampling
Min/max of the scattering angle varies in place and time.
➢ Aerosol type cannot be discriminated everywhere on the disk
➢ Our physical assumptions seem adapted to the angular capabilities offered by MSG/SEVIRI.
21. Conclusion
A method was presented to retrieve the aerosol optical depth
– Based on a joint retrieval of AOD and surface reflectance. The angular shape of the BRDF is particularly sensitive to the presence of aerosols and allows aerosol and surface signals to be separated.
– Works for any surface type (including bright targets)
– Validated against AERONET and MODIS data (bias < 0.03)
– Relies on a simple model (only analytical formulas, not a "black box")
– Hypotheses and limits are well identified
Compact code
– Framework in C++, ~ 2200 LOC
– Easy to maintain and upgrade
Low computational resources required
– One day of data: 96 slots full disk
– Run time : ~ 3h on a PC workstation
• 2h for preprocess and partial atmospheric correction
• 1h for joint aerosol/surface inversion
➢ Suitable to be integrated in an operational centre
22. On-going developments
Introduction of a simplified water BRDF reflectance model
– To adapt the method over ocean by designing a BRDF suited to the sea surface
Use of the three solar channels for aerosol type discrimination
– To exploit the spectral and angular information to derive the aerosol class (Angström coefficient determination)
Continuous work to increase the grid resolution and extend the geographical
coverage
– To include data from different instruments (does not require further methodological
developments).
Analysis of the input signal
– For error/uncertainty determination
Cloud mask
– To recover strong aerosol episodes and filter residual clouds or thin cirrus
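The Angström coefficient mentioned above relates AOD at two wavelengths. A minimal sketch with illustrative wavelengths and AOD values:

```python
import math

def angstrom_exponent(tau1, lam1, tau2, lam2):
    """Angström exponent alpha = -ln(tau1/tau2) / ln(lam1/lam2).
    Large alpha -> fine-mode aerosol; alpha near 0 -> coarse dust."""
    return -math.log(tau1 / tau2) / math.log(lam1 / lam2)

# Illustrative AOD values at 0.44 um and 0.87 um:
alpha = angstrom_exponent(0.30, 0.44, 0.12, 0.87)
print(round(alpha, 2))  # -> 1.34
```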
23. Carrer, D., J.-L. Roujean, O. Hautecoeur, and T. Elias (2010),
Daily estimates of aerosol optical thickness over land surface based on a
directional and temporal analysis of SEVIRI MSG visible observations,
J. Geophys. Res., 115, D10208, doi:10.1029/2009JD012272.
dominique.carrer@meteo.fr
http://www.agu.org/journals/jd/jd1010/2009JD012272/