2005-10-31 Characterization of Aerosol Events - Rudolf Husar
This document summarizes research on characterizing aerosol events using monitoring data. It discusses:
- Long-term monitoring networks that measure particulate matter and species over hundreds of sites
- Tools like analysts' consoles that use spatial and temporal data to help characterize events
- Methods for decomposing temporal signals to identify seasonal, random, and event components
- Examples of analyzing specific aerosol events across the Eastern US using these tools and data.
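The signal-decomposition step in the bullets above can be sketched in a few lines of Python. This is a minimal illustration, not the consoles' actual algorithm: the single annual harmonic and the 2-sigma event threshold are assumptions for the sketch.

```python
import numpy as np

def decompose_daily_series(values, period=365, event_threshold=2.0):
    """Split a daily concentration series into seasonal, random, and
    event components. values: 1-D array of daily concentrations."""
    values = np.asarray(values, dtype=float)
    days = np.arange(len(values))

    # Seasonal component: mean plus one annual harmonic, fit by least squares.
    X = np.column_stack([
        np.ones_like(days, dtype=float),
        np.sin(2 * np.pi * days / period),
        np.cos(2 * np.pi * days / period),
    ])
    coef, *_ = np.linalg.lstsq(X, values, rcond=None)
    seasonal = X @ coef

    # Event component: residuals standing well above the random noise.
    residual = values - seasonal
    sigma = residual.std()
    event = np.where(residual > event_threshold * sigma, residual, 0.0)
    random_part = residual - event
    return seasonal, random_part, event
```

By construction the three components sum back to the original series, so an analyst can inspect each piece separately.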
The document discusses ICCC activities related to developing methodologies for monitoring drivers of fires and haze in Indonesia and estimating greenhouse gas emissions. Key outputs include protocols for monitoring drivers, an early warning system, and more accurate estimates of emissions, human health impacts, and patterns of drivers. It also summarizes challenges in detecting smoldering peatland fires using satellites and presents preliminary findings from a project using nighttime satellite data to estimate peatland fire emissions.
2005-06-03 Aerosol Characterization and the Supporting Information Infrastruc... - Rudolf Husar
The document discusses aerosol characterization and monitoring infrastructure. It describes the goal of attaining natural regional haze conditions by 2064 according to the Regional Haze Rule. It also discusses the National Ambient Air Monitoring Strategy and related initiatives like FASTNET and DataFed that pursue its recommendations around enhanced monitoring. Finally, it outlines some of the technical challenges around fully characterizing the multidimensional aerosol system.
2004-06-24 Fast Aerosol Sensing Tools for Natural Event Tracking FASTNET Proj... - Rudolf Husar
The document discusses the FASTNET project which aims to better characterize natural haze conditions through the analysis of major natural aerosol events like forest fires and dust storms. The goal is to develop tools for data access, archiving, and analysis to describe the spatial, temporal, and compositional features of natural aerosols. This will help understand their contribution to regional haze and establish baseline natural conditions as required by the Regional Haze Rule.
The 2018 drought significantly impacted the carbon, water and energy dynamics of a mature Sitka spruce forest and a recently restocked clear-felling site located on organo-mineral soil. At both sites, low precipitation and high evapotranspiration rates during the drought led to soil water depletion that did not recover by the end of 2018. This caused reduced photosynthesis, higher water stress, and lower carbon sink strength compared to previous years. Specifically, the mature forest shifted from a strong carbon sink to neutral, while the restocked site became a carbon source. The drought also increased heat losses through transpiration and latent heat flux at both sites.
On Friday, July 19, 2019 ICLR conducted a Friday Forum webinar titled 'The Northern tornadoes project: Identifying every tornado in Canada', led by Dr. Greg Kopp from Western University. Severe weather is causing ever increasing damage and losses in Canada, with wind being one of the major parameters. In spite of this, no organization in Canada has the responsibility to systematically identify and classify every tornado occurrence in Canada. The Meteorological Service of Canada has the responsibility of warning Canadians about severe or dangerous weather. When there is public interest, they examine and report on damage after the fact; they also house the official archive of tornado damage reports and occurrences. However, according to recent research using lightning and other data, this archive is believed to be incomplete with perhaps only about one-third of tornadoes being identified. Examining maps of Canada’s tornado occurrence locations and population density highlights the problem, with the “missing tornadoes” expected to be primarily in regions with low population density. A collaboration between Western University and Environment and Climate Change Canada, the Northern Tornadoes Project (NTP) aims to address this issue by systematically working to identify all tornado occurrences in the country and archive as much information as possible for each of them. The ultimate goal of NTP is to improve severe weather prediction, mitigate against damage to people and property, and investigate future implications due to climate change. The seminar presented the overall project goals and methods, along with the outcomes from 2018.
Dr. Greg Kopp is the lead researcher in the Northern Tornadoes Project and a professor in Western University’s Department of Civil and Environmental Engineering. He received a B.Sc.M.E. from the University of Manitoba in 1989, a M.Eng. from McMaster University in 1991 and a Ph.D. in Mechanical Engineering from the University of Toronto in 1995. His expertise and research relate to mitigating damage to structures during extreme wind storms such as tornadoes and hurricanes. He works actively to implement research findings into practice, currently serving as Chair of the ASCE 49 Standards Committee on Wind Tunnel Testing For Buildings and other Structures, and as a member of various other Building Code committees. A former Canada Research Chair in Wind Engineering, he is also the lead researcher for the Three Little Pigs Project at The Insurance Research Lab for Better Homes.
The document discusses the challenge of characterizing particulate matter using remote sensing data due to the complex and multidimensional nature of aerosols. It presents results from using SeaWiFS satellite data combined with surface observations to characterize aerosols over the US from 2000-2003. Specific cases studied include quantifying smoke emissions from agricultural fires in Kansas in 2003 by analyzing the optical thickness and shape of smoke plumes. Summer climatologies of aerosol optical thickness over the US from 2000-2004 are also shown.
The document describes the Exceptional Event Decision Support System (EE DSS), a tool to help states and EPA regions implement the EPA's Exceptional Events Rule. The EE DSS uses air quality, meteorological, and other data to screen for exceedances and flag those likely caused by exceptional events like dust storms, wildfires, or July 4th fireworks. It aims to minimize the technical hurdles of the EE rule and provide a uniform, transparent methodology. The document outlines the EE DSS's data sources and modeling, screening approach, tools for visualizing events, and provides an example demo of the system in action.
This document describes how rainfall estimates are created from raw radar data at the National Centers for Environmental Information (NCEI). It explains that multiple radars are combined and merged to get nationwide coverage with 2 minute resolution on a standardized grid. Raw radar returns are cleaned of contamination from birds and other sources. The vertical radar scans are integrated into a single rainfall estimate. The final product provides a consistent, high quality rainfall estimate compared to the original radar data which had poorer spatial and temporal resolution and quality issues.
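As a rough illustration of the merging step, compositing several pre-gridded single-radar estimates onto one national grid might look like the sketch below. The real NCEI processing uses weighted blending and extensive quality control; `nanmax` compositing and the function name here are simplifying assumptions.

```python
import numpy as np

def mosaic(radar_grids):
    """Merge single-radar rainfall grids (already remapped onto a common
    grid, NaN where a radar has no coverage) into one national mosaic by
    taking the maximum valid estimate in each cell."""
    stack = np.stack(radar_grids)
    return np.nanmax(stack, axis=0)
```

Each input grid contributes only where it has coverage, so overlapping radars fill each other's gaps.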
Climate statisticians analyze observational climate data and model simulations to detect trends, attribute causes, and quantify uncertainties. They use statistical methods like linear regression to attribute observed warming to human and natural factors. They also use extreme value theory to describe rare weather events and project how these extremes may change with continued warming. A key task is quantifying various sources of uncertainty in climate projections, like different model sensitivities and emissions scenarios.
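A toy version of the regression-based attribution mentioned above: regress the observed anomaly series onto candidate forcing time series and read off the scaling factors. The function name and the plain ordinary-least-squares setup are illustrative assumptions; real detection-and-attribution studies use fingerprint patterns and noise covariance estimates from control runs.

```python
import numpy as np

def attribute_trend(temp_anom, forcings):
    """Regress an observed temperature-anomaly series onto candidate
    forcing time series (e.g. anthropogenic and natural). The returned
    scaling factors say how much each forcing series must be scaled to
    best match the observations, plus an intercept."""
    X = np.column_stack([np.ones(len(temp_anom))] + list(forcings))
    beta, *_ = np.linalg.lstsq(X, np.asarray(temp_anom, float), rcond=None)
    return beta  # [intercept, scale_1, scale_2, ...]
```

A scaling factor near 1 means the forcing's response magnitude matches observations; near 0 means the observations carry no imprint of that forcing.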
This document describes a Carbon Cycle Fossil Fuel Data Assimilation System (CCFFDAS) that was developed to estimate fossil fuel emissions by assimilating in situ and remotely sensed CO2 observations. The CCFFDAS couples atmospheric transport, terrestrial ecosystem, and fossil fuel emissions models. It was used to assess the potential of CO2 observations from the GOSAT satellite to constrain fossil fuel emissions for a week in 2008. The CCFFDAS provides national and annual scale uncertainties for fossil fuel emissions that are within the range of inventory uncertainties. It also allows exploring options for satellite mission design and surface networks through "verification" and "synergy" modes.
This research summarizes measurements taken during the COPE-MED field campaign to better understand warm rain processes and entrainment effects on heavy precipitation. LWC probes were compared and generally agreed well, though the PVM overestimated LWC at higher concentrations and smaller diameters. A LWC survey with vertical statistics was conducted. Analysis of droplet spectra bimodality from low-precipitation cases found evidence of bimodality but secondary activation was unlikely the cause, with the bimodality mechanism remaining unclear. Future COPE analysis will utilize these LWC and bimodality findings to evaluate hypotheses regarding warm rain processes and entrainment impacts on heavy precipitation.
Effects of Wind Direction on VOC Concentrations in Southeast Kansas - Sergio A. Guerra
Twenty-four-hour whole-air samples were collected in evacuated stainless steel canisters and analyzed for volatile organic compounds (VOC) at selected sites in southeast Kansas from March 1999 to October 2000. The purpose was to assess the influence on air quality of four industrial facilities that burn hazardous waste located in the communities of Coffeyville, Chanute, Independence and Fredonia. Fifteen of the VOC analytes were found at concentrations above the detection limit and above levels observed in the blanks. Data were analyzed to investigate whether sampling site and date had a significant effect on VOC concentration. Results indicate that site and/or date were significant factors for many of the VOCs. To further investigate the temporal factor, sampling days were divided into four classifications based on wind direction: predominantly north winds, predominantly south winds, calm/variable winds and
other winds. Results from statistical analyses show that wind direction was a significant factor for benzene, toluene, o-xylene, naphthalene, and carbon tetrachloride. Data from upwind and downwind samples were analyzed for the four cities of interest in the study area to investigate the effect of the four targeted sources on VOC concentrations. Results from Fredonia showed higher concentrations of toluene, ethylbenzene, styrene, methyl chloride, and trichloroethylene in the upwind samples, although none of the results were statistically significant. Chanute also showed higher concentrations of the same compounds and m,p-xylene in the upwind samples; results were significant at the 0.05 level for toluene, ethylbenzene, and xylene. These results indicate that sources other than those targeted in the sampling network may be contributing to the VOC levels. Results from Independence showed higher concentrations of ethylbenzene and styrene in the downwind samples; results were statistically significant. These results indicate that the source targeted in the sampling network may be contributing to the VOC levels at those sampling sites.
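The wind-direction classification and group comparison described above can be sketched as follows. The sector bounds, calm cutoff, and function names are illustrative assumptions, not the study's definitions, and the Welch t statistic stands in for whatever tests the authors actually ran.

```python
import numpy as np

def classify_wind_day(dir_deg, speed_ms, calm_cutoff=1.0):
    """Assign a sampling day to one of four wind classes:
    north, south, calm/variable, or other."""
    if speed_ms < calm_cutoff:
        return "calm/variable"
    if dir_deg >= 315 or dir_deg < 45:
        return "north"
    if 135 <= dir_deg < 225:
        return "south"
    return "other"

def welch_t(a, b):
    """Welch's t statistic for two independent samples (unequal variance),
    e.g. upwind vs. downwind VOC concentrations."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    va, vb = a.var(ddof=1) / len(a), b.var(ddof=1) / len(b)
    return (a.mean() - b.mean()) / np.sqrt(va + vb)
```

A positive t for (downwind, upwind) pairs would point toward the targeted source; a positive t for (upwind, downwind) pairs, as at Fredonia and Chanute, points toward other sources.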
Use of Probabilistic Statistical Techniques in AERMOD Modeling Evaluations - Sergio A. Guerra
The advent of the short-term National Ambient Air Quality Standards (NAAQS) prompted modelers to reassess common practices in dispersion modeling analyses. The probabilistic nature of the new short-term standards also opens the door to alternative modeling techniques based on probability. One of these is the Monte Carlo technique, which can be used to account for emission variability in permit modeling.
Currently, it is assumed that a given emission unit operates at its maximum capacity every hour of the year. This assumption may be appropriate for facilities that run at full capacity most of the time. In most cases, however, emission units operate at variable loads that produce variable emissions. Assuming constant maximum emissions is therefore overly conservative for facilities, such as power plants, that are not in operation all the time and that produce high concentrations only during short periods.
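A minimal sketch of the Monte Carlo idea, assuming a historical record of hourly load fractions is available (the sampling rule and function name are illustrative; EMVAP and permit-modeling guidance define the actual procedure):

```python
import numpy as np

rng = np.random.default_rng(42)

def monte_carlo_emissions(load_profile, max_emission_gps, n_iter=1000):
    """Sample hourly emission rates by drawing operating loads from an
    observed load distribution instead of assuming constant maximum
    capacity. load_profile: 1-D array of historical hourly load
    fractions (0-1). Returns an (n_iter, n_hours) array of sampled
    emission rates in g/s."""
    load_profile = np.asarray(load_profile, float)
    # Resample historical load fractions with replacement per iteration.
    draws = rng.choice(load_profile, size=(n_iter, len(load_profile)))
    return draws * max_emission_gps
```

Each iteration yields one plausible year of emissions; feeding the ensemble through the dispersion model gives a distribution of design values rather than a single worst case.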
Another element of conservatism in NAAQS demonstrations relates to combining predicted concentrations from the AMS/EPA Regulatory Model (AERMOD) with observed (monitored) background concentrations. Normally, some of the highest monitored observations are added to the AERMOD results yielding a very conservative combined concentration.
A case study is presented to evaluate the use of alternative probabilistic methods to complement the shortcomings of current dispersion modeling practices. This case study includes the use of the Monte Carlo technique and the use of a reasonable background concentration to combine with the AERMOD predicted concentrations. The use of these methods is in harmony with the probabilistic nature of the NAAQS and can help demonstrate compliance through dispersion modeling analyses, while still being protective of the NAAQS.
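The background-combination idea above can be sketched as a percentile pairing rather than a stacking of maxima. The 98th/50th percentile choices echo the case study's framing; the pairing rule itself and the function name are illustrative assumptions.

```python
import numpy as np

def combined_design_value(modeled, background, model_pct=98, bg_pct=50):
    """Combine a distribution of modeled concentrations with monitored
    background by pairing percentiles, instead of adding the highest
    monitored values to the modeled peak."""
    return (np.percentile(modeled, model_pct)
            + np.percentile(background, bg_pct))
```

Pairing a high modeled percentile with a median background is still conservative (the two peaks rarely coincide in time) but avoids compounding two worst cases.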
Effects of Wind Direction on Trace Metal Concentration in Southeast Kansas - Sergio A. Guerra
This study investigated the effects of wind direction on trace metal concentrations in particulate matter in southeast Kansas. Air quality monitoring was conducted from March to October 2000 at nine sites. Concentrations of six metals (beryllium, chromium, arsenic, cadmium, barium, and lead) were analyzed. The results showed that wind direction significantly impacted chromium, barium, and lead concentrations, with calm/variable winds associated with higher chromium and lead levels and north winds linked to higher barium. However, upwind/downwind comparisons found no significant differences near targeted industrial sources. Mean levels of chromium, arsenic, and cadmium exceeded EPA risk levels at some sites.
Sami Siikanen, VTT: Using gas finder infrared camera in detecting landfill me... - Valio
This document discusses using gas finder infrared cameras to detect methane leaks from landfills. It summarizes tests conducted at the Kuopio Jätekukko landfill in Finland in 2009 and 2010. The goals of the tests were to locate methane leaks to reduce odors and improve methane collection for energy production. Thermal cameras detected methane leaks appearing as black clouds emanating from well covers and slopes. The cameras allowed locating leaks for repair, which could enhance biogas collection and use. Later tests aimed to further evaluate camera performance in different weather. More advanced cameras may enable visualizing and quantifying specific gas concentrations.
This study uses atmospheric 14CO2 measurements to estimate fossil fuel (ff) CO2 emission hotspots in an urban area along the Rhine valley. Two approaches are compared: 1) an upwind-downwind approach using paired stations to estimate background levels, and 2) a regional background approach. The ffCO2 concentrations ranged from 0-10 ppm. Uncertainties were about 1.2 ppm for the upwind-downwind approach and 50-100% higher for the regional approach. A strong correlation was also found between total CO2 and ffCO2 offsets across the study area, indicating that total CO2 can serve as a proxy for ffCO2. However, accounting for nuclear 14C contamination remained challenging.
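The core of such estimates is a two-member mass balance exploiting the fact that fossil carbon is 14C-free (Delta14C = -1000 permil). The sketch below shows that standard balance only; it deliberately neglects the nuclear-14C and respiration corrections the study grapples with, and the function name is hypothetical.

```python
def ffco2_from_delta14c(co2_bg, d14c_bg, d14c_meas):
    """Fossil fuel CO2 excess (ppm) from a measured Delta14C depletion
    relative to background, assuming fossil CO2 carries Delta14C of
    -1000 permil and neglecting other 14C sources.
    co2_bg: background CO2 (ppm); d14c_bg, d14c_meas: Delta14C (permil)."""
    return co2_bg * (d14c_bg - d14c_meas) / (d14c_meas + 1000.0)
```

The more depleted the measured Delta14C relative to background, the larger the inferred fossil fuel contribution.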
This document outlines validation plans for the Ozone Mapping and Profiler Suite (OMPS) instrument on the NPOESS Preparatory Project satellite. It discusses:
1) The calibration and validation team members and their roles in characterizing instrument performance through comparisons with other satellite and ground-based instruments from launch through long-term monitoring.
2) The schedule of major validation tasks from pre-launch testing through intensive in-orbit validation in the first two years and transition to long-term monitoring.
3) Examples of early tests and comparisons that will be done with internal instrument measurements, early solar views, and single days of Earth view data to evaluate performance.
Conference on the Environment - Guerra presentation, Nov 19, 2014 - Sergio A. Guerra
This document discusses innovative dispersion modeling practices to achieve reasonable conservatism in regulatory modeling demonstrations. It presents a case study evaluating the Emissions and Meteorological Variability Processor (EMVAP) and approaches to establish background concentrations. The case study models SO2 concentrations from a power plant using 1) constant emissions, 2) variable emissions, and 3) EMVAP. EMVAP provides more realistic concentrations while accounting for emission variability. Using the 50th percentile monitored background concentration when combining with modeled values provides statistical conservatism compared to using high percentile values.
The document discusses plans to improve the Global Fire Assimilation System (GFAS) within the Copernicus Atmosphere Monitoring Service (CAMS). Key points include:
1) GFAS will use more satellite observations of fire radiative power (FRP), including from geostationary satellites, to better characterize fires over time and reduce errors from individual satellites.
2) It will develop the capability to forecast FRP using weather data and fire indices and represent changes in emission factors over time.
3) These improvements aim to provide a more accurate and stable representation of the global FRP distribution at an hourly temporal resolution.
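The FRP-to-emissions step that underlies GFAS can be illustrated by integrating observed FRP over time and applying a biomass conversion factor. The 0.368 kg/MJ default below is a commonly cited laboratory-derived value for dry matter; GFAS's operational, land-cover-dependent factors may differ, and the function name is an assumption.

```python
def dry_matter_burned(frp_mw, interval_s, conversion_kg_per_mj=0.368):
    """Convert a series of fire radiative power observations (MW) into
    dry matter burned (kg): integrate FRP over time to get fire
    radiative energy (MW * s = MJ), then apply a conversion factor."""
    fre_mj = sum(p * interval_s for p in frp_mw)
    return fre_mj * conversion_kg_per_mj
```

Species emissions then follow by multiplying dry matter burned by per-species emission factors.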
Estimation of Solar Radiation over Ibadan from Routine Meteorological Parameters - theijes
Innovative Dispersion Modeling Practices to Achieve a Reasonable Level of Con... - Sergio A. Guerra
The document discusses innovative modeling practices to achieve reasonable conservatism in AERMOD modeling demonstrations. It presents a case study evaluating three modeling techniques: EMVAP, which assigns random emission rates over iterations; ARM2, which calculates NOx to NO2 conversion based on plume entrapment; and using the 50th percentile monitored background concentration. The case study found lower modeled concentrations using EMVAP and ARM2 compared to current practices, demonstrating these techniques can provide more realistic results while still protecting air quality standards. Pairing the 98th percentile predicted concentration with the 50th percentile monitored background provided a statistically conservative but reasonable level of conservatism.
On March 15, 33 experts from the Department’s National Nuclear Security Administration (NNSA) arrived in Japan along with more than 17,200 pounds of equipment. After initial deployments at U.S. consulates and military installations in Japan, these teams have utilized their unique skills, expertise and equipment to help assess, survey, monitor and sample areas for radiation. The 33 team members joined another six DOE personnel already in Japan.
Since arriving in Japan, NNSA teams have collected and analyzed data gathered from more than 40 hours of flights aboard Department of Defense aircraft and thousands of ground monitoring points.
2003-10-15 Biomass Smoke Emissions and Transport: Community-based Satellite a... - Rudolf Husar
The document discusses biomass smoke emissions and transport patterns in North America as analyzed using satellite and surface data. Key findings include:
- Four main fire zones were identified based on fire size and distribution: Northern, Northwestern, Southeastern, and Mexican.
- Peak fire seasons vary by region, from December to February in Mexico to June to August in Northern Canada and Northwest US.
- Smoke emission and concentration patterns were measured and modeled using various data sources and models. Near-source and distant smoke transport patterns were estimated.
- Characterizing smoke fully requires describing multiple properties including location, time, particle size, composition, shape, and mixtures - a challenge given sparse measurement data.
Using SAR Intensity and Coherence to Detect A Moorland Wildfire Scar - Gail Millin-Chalabi
This document presents a study that used SAR intensity and coherence to detect a fire scar in a degraded moorland environment in the UK. It describes the methodology, which involved preprocessing SAR data and extracting backscatter values for different land cover classes within the fire scar over time. The results show that precipitation and land cover affected the SAR intensity signal inside the fire scar, with peat bog having the highest returns. InSAR coherence was also analyzed for pairs before and after the fire. The summary concludes that SAR intensity can detect large fire scars but coherence needs more exploration, and recommends investigating different fire scenarios and radar frequencies.
The document describes the rules of a word-guessing game called "Forca" (Hangman), in which children try to guess a word chosen by the teacher or a student by calling out letters, with parts of a gallows or stick figure drawn for each wrong guess; the goal is to guess the word before the gallows drawing is complete.
The Pr4D4 optical effect allows light to be manipulated dynamically and precisely. It uses liquid crystals to alter the shape of incoming light and create holograms in real time. This could be used to project 3D images without glasses or to enhance augmented reality.
This document describes how rainfall estimates are created from raw radar data at the National Centers for Environmental Information (NCEI). It explains that multiple radars are combined and merged to get nationwide coverage with 2 minute resolution on a standardized grid. Raw radar returns are cleaned of contamination from birds and other sources. The vertical radar scans are integrated into a single rainfall estimate. The final product provides a consistent, high quality rainfall estimate compared to the original radar data which had poorer spatial and temporal resolution and quality issues.
Climate statisticians analyze observational climate data and model simulations to detect trends, attribute causes, and quantify uncertainties. They use statistical methods like linear regression to attribute observed warming to human and natural factors. They also use extreme value theory to describe rare weather events and project how these extremes may change with continued warming. A key task is quantifying various sources of uncertainty in climate projections, like different model sensitivities and emissions scenarios.
This document describes a Carbon Cycle Fossil Fuel Data Assimilation System (CCFFDAS) that was developed to estimate fossil fuel emissions by assimilating in situ and remotely sensed CO2 observations. The CCFFDAS couples atmospheric transport, terrestrial ecosystem, and fossil fuel emissions models. It was used to assess the potential of CO2 observations from the GOSAT satellite to constrain fossil fuel emissions for a week in 2008. The CCFFDAS provides national and annual scale uncertainties for fossil fuel emissions that are within the range of inventory uncertainties. It also allows exploring options for satellite mission design and surface networks through "verification" and "synergy" modes.
This research summarizes measurements taken during the COPE-MED field campaign to better understand warm rain processes and entrainment effects on heavy precipitation. LWC probes were compared and generally agreed well, though the PVM overestimated LWC at higher concentrations and smaller diameters. A LWC survey with vertical statistics was conducted. Analysis of droplet spectra bimodality from low-precipitation cases found evidence of bimodality but secondary activation was unlikely the cause, with the bimodality mechanism remaining unclear. Future COPE analysis will utilize these LWC and bimodality findings to evaluate hypotheses regarding warm rain processes and entrainment impacts on heavy precipitation.
Effects of Wind Direction on VOC Concentrations in Southeast KansasSergio A. Guerra
Twenty-four-hour whole-air samples were collected in evacuated stainless steel canisters and analyzed for volatile organic compounds (VOC) at selected sites in southeast Kansas from March 1999 to October 2000. The purpose was to assess the influence on air quality of four industrial facilities that burn hazardous waste located in the communities of Coffeyville, Chanute, Independence and Fredonia. Fifteen of the VOC analytes were found at concentrations above the detection limit and above levels observed in the blanks. Data were analyzed to investigate whether sampling site and date had a significant effect on VOC concentration. Results indicate that site and/or date were significant factors for many of the VOCs. To further investigate the temporal factor, sampling days were divided into four classifications based on wind direction: predominantly north winds, predominantly south winds, calm/variable winds and
other winds. Results from statistical analyses show that wind direction was a significant factor for benzene, toluene, o-xylene, naphthalene, and carbon tetrachloride. Data from upwind and downwind samples were analyzed for the four cities of interest in the study area, to investigate the effect of the four targeted sources on VOC concentrations. Results from Fredonia showed higher concentrations of toluene, ethylbenzene, styrene, methyl chloride, and trichloroethylene in the upwind samples, although none of the results were statistically significant. Chanute also showed higher concentrations of the same compounds and m,p-xylene in the upwind samples; results were significant at the 0.05 level for toluene, ethylbenzene, and xylene. These results indicate that sources other than those targeted in the sampling network may be contributing to
the VOC levels. Results from Independence showed higher concentrations of ethylbenzene and styrene in the downwind samples; results were statistically significant. These results indicate that the source targeted in the sampling network may be contributing to the VOC levels at those sampling sites.
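The upwind/downwind comparisons above rest on a significance test of concentration differences between wind classes. As a rough illustration of the kind of test involved, the sketch below runs a distribution-free permutation test on synthetic concentration data; the data, sample sizes, and choice of test are assumptions for illustration, not the statistical method actually used in the study.

```python
# Permutation test for an upwind vs. downwind concentration difference
# (illustrative sketch with synthetic, lognormally distributed data).
import numpy as np

rng = np.random.default_rng(42)

# Synthetic daily mean concentrations (arbitrary units) for two wind classes
upwind = rng.lognormal(mean=0.0, sigma=0.4, size=30)
downwind = rng.lognormal(mean=0.2, sigma=0.4, size=30)

observed = downwind.mean() - upwind.mean()

# Shuffle the pooled data many times; count how often a random split
# produces a mean difference at least as large as the observed one
pooled = np.concatenate([upwind, downwind])
n = len(upwind)
n_perm = 10_000
count = 0
for _ in range(n_perm):
    rng.shuffle(pooled)
    diff = pooled[n:].mean() - pooled[:n].mean()
    if abs(diff) >= abs(observed):
        count += 1
p_val = (count + 1) / (n_perm + 1)

print(f"mean difference = {observed:.3f}, permutation p = {p_val:.4f}")
```

A permutation test makes no normality assumption, which suits right-skewed ambient concentration data; a parametric alternative would be a t-test on log-transformed concentrations.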
Use of Probabilistic Statistical Techniques in AERMOD Modeling Evaluations - Sergio A. Guerra
The advent of the short-term National Ambient Air Quality Standards (NAAQS) prompted modelers to reassess common practices in dispersion modeling analyses. The probabilistic nature of the new short-term standards also opens the door to alternative modeling techniques that are based on probability. One of these is the Monte Carlo technique, which can be used to account for emission variability in permit modeling.
Currently, it is assumed that a given emission unit operates at its maximum capacity every hour of the year. This assumption may be appropriate for facilities that operate at full capacity most of the time. In most cases, however, emission units operate at variable loads that produce variable emissions. Assuming constant maximum emissions is therefore overly conservative for facilities, such as power plants, that operate only intermittently and produce high concentrations only during short periods.
Another element of conservatism in NAAQS demonstrations relates to combining predicted concentrations from the AMS/EPA Regulatory Model (AERMOD) with observed (monitored) background concentrations. Normally, some of the highest monitored observations are added to the AERMOD results yielding a very conservative combined concentration.
A case study is presented to evaluate the use of alternative probabilistic methods to complement the shortcomings of current dispersion modeling practices. This case study includes the use of the Monte Carlo technique and the use of a reasonable background concentration to combine with the AERMOD predicted concentrations. The use of these methods is in harmony with the probabilistic nature of the NAAQS and can help demonstrate compliance through dispersion modeling analyses, while still being protective of the NAAQS.
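The core of the Monte Carlo idea described above can be sketched numerically. The sketch below assumes modeled concentrations scale linearly with emission rate, so a unit-emission dispersion run can be rescaled by sampled hourly emissions; the concentration field, operating profile, and all parameter values are synthetic stand-ins, not EMVAP's or AERMOD's actual inputs.

```python
# Monte Carlo treatment of emission variability (illustrative sketch).
# Assumes concentrations scale linearly with emission rate, so a
# unit-emission model run can be rescaled by sampled hourly emissions.
import numpy as np

rng = np.random.default_rng(0)
n_hours = 8760

# Normalized hourly concentrations at one receptor from a unit-emission
# run (synthetic stand-in for dispersion model output, ug/m3 per g/s)
chi_unit = rng.gamma(shape=0.5, scale=2.0, size=n_hours)

# Conventional worst case: maximum-capacity emissions every hour (g/s)
e_max = 10.0
design_constant = np.percentile(chi_unit * e_max, 99)

# Monte Carlo: sample hourly loads from an assumed operating profile,
# e.g. the unit runs ~60% of hours at 40-100% of capacity
n_iter = 200
design_mc = np.empty(n_iter)
for i in range(n_iter):
    operating = rng.random(n_hours) < 0.6
    load = np.where(operating, rng.uniform(0.4, 1.0, n_hours), 0.0)
    design_mc[i] = np.percentile(chi_unit * e_max * load, 99)

print(f"constant-emission design value: {design_constant:.1f}")
print(f"Monte Carlo design value (mean over iterations): {design_mc.mean():.1f}")
```

Because each sampled hourly load is at most the maximum capacity, the Monte Carlo design values fall below the constant-maximum result, illustrating how the conventional assumption overstates concentrations for intermittently operating units.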
Effects of Wind Direction on Trace Metal Concentration in Southeast Kansas - Sergio A. Guerra
This study investigated the effects of wind direction on trace metal concentrations in particulate matter in southeast Kansas. Air quality monitoring was conducted from March to October 2000 at nine sites. Concentrations of six metals (beryllium, chromium, arsenic, cadmium, barium, and lead) were analyzed. The results showed that wind direction significantly impacted chromium, barium, and lead concentrations, with calm/variable winds associated with higher chromium and lead levels and north winds linked to higher barium. However, upwind/downwind comparisons found no significant differences near targeted industrial sources. Mean levels of chromium, arsenic, and cadmium exceeded EPA risk levels at some sites.
Sami Siikanen, VTT: Using gas finder infrared camera in detecting landfill me... - Valio
This document discusses using gas finder infrared cameras to detect methane leaks from landfills. It summarizes tests conducted at the Kuopio Jätekukko landfill in Finland in 2009 and 2010. The goals of the tests were to locate methane leaks to reduce odors and improve methane collection for energy production. Thermal cameras detected methane leaks appearing as black clouds emanating from well covers and slopes. The cameras allowed locating leaks for repair, which could enhance biogas collection and use. Later tests aimed to further evaluate camera performance in different weather. More advanced cameras may enable visualizing and quantifying specific gas concentrations.
This study uses atmospheric 14CO2 measurements to estimate fossil fuel (ff) CO2 emission hotspots in an urban area along the Rhine valley. Two approaches are compared: 1) an upwind-downwind approach using paired stations to estimate background levels, and 2) a regional background approach. The ffCO2 concentrations ranged from 0-10 ppm. Uncertainties were about 1.2 ppm for the upwind-downwind approach and 50-100% higher for the regional approach. A strong correlation was also found between total CO2 and ffCO2 offsets across the study area, indicating total CO2 can serve as a proxy for ffCO2. However, accounting for nuclear 14C contamination remained challenging.
This document outlines validation plans for the Ozone Mapping and Profiler Suite (OMPS) instrument on the NPOESS Preparatory Project satellite. It discusses:
1) The calibration and validation team members and their roles in characterizing instrument performance through comparisons with other satellite and ground-based instruments from launch through long-term monitoring.
2) The schedule of major validation tasks from pre-launch testing through intensive in-orbit validation in the first two years and transition to long-term monitoring.
3) Examples of early tests and comparisons that will be done with internal instrument measurements, early solar views, and single days of Earth view data to evaluate performance.
Conference on the Environment - GUERRA presentation, Nov 19, 2014 - Sergio A. Guerra
This document discusses innovative dispersion modeling practices to achieve reasonable conservatism in regulatory modeling demonstrations. It presents a case study evaluating the Emissions and Meteorological Variability Processor (EMVAP) and approaches to establish background concentrations. The case study models SO2 concentrations from a power plant using 1) constant emissions, 2) variable emissions, and 3) EMVAP. EMVAP provides more realistic concentrations while accounting for emission variability. Using the 50th percentile monitored background concentration when combining with modeled values provides statistical conservatism compared to using high percentile values.
The document discusses plans to improve the Global Fire Assimilation System (GFAS) within the Copernicus Atmosphere Monitoring Service (CAMS). Key points include:
1) GFAS will use more satellite observations of fire radiative power (FRP), including from geostationary satellites, to better characterize fires over time and reduce errors from individual satellites.
2) It will develop the capability to forecast FRP using weather data and fire indices and represent changes in emission factors over time.
3) These improvements aim to provide a more accurate and stable representation of the global FRP distribution at an hourly temporal resolution.
Estimation of Solar Radiation over Ibadan from Routine Meteorological Parameters - theijes
The International Journal of Engineering & Science is aimed at providing a platform for researchers, engineers, scientists, or educators to publish their original research results, to exchange new ideas, to disseminate information in innovative designs, engineering experiences and technological skills. It is also the Journal's objective to promote engineering and technology education. All papers submitted to the Journal will be blind peer-reviewed. Only original articles will be published.
Innovative Dispersion Modeling Practices to Achieve a Reasonable Level of Con... - Sergio A. Guerra
The document discusses innovative modeling practices to achieve reasonable conservatism in AERMOD modeling demonstrations. It presents a case study evaluating three modeling techniques: EMVAP, which assigns random emission rates over iterations; ARM2, which calculates NOx to NO2 conversion based on plume entrapment; and using the 50th percentile monitored background concentration. The case study found lower modeled concentrations using EMVAP and ARM2 compared to current practices, demonstrating these techniques can provide more realistic results while still protecting air quality standards. Pairing the 98th percentile predicted concentration with the 50th percentile monitored background provided a statistically conservative but reasonable level of conservatism.
On March 15, 33 experts from the Department’s National Nuclear Security Administration (NNSA) arrived in Japan along with more than 17,200 pounds of equipment. After initial deployments at U.S. consulates and military installations in Japan, these teams have utilized their unique skills, expertise and equipment to help assess, survey, monitor and sample areas for radiation. The 33 team members joined another six DOE personnel already in Japan.
Since arriving in Japan, NNSA teams have collected and analyzed data gathered from more than 40 hours of flights aboard Department of Defense aircraft and thousands of ground monitoring points.
2003-10-15 Biomass Smoke Emissions and Transport: Community-based Satellite a... - Rudolf Husar
The document discusses biomass smoke emissions and transport patterns in North America as analyzed using satellite and surface data. Key findings include:
- Four main fire zones were identified based on fire size and distribution: Northern, Northwestern, Southeastern, and Mexican.
- Peak fire seasons vary by region, from December to February in Mexico to June to August in Northern Canada and Northwest US.
- Smoke emission and concentration patterns were measured and modeled using various data sources and models. Near-source and distant smoke transport patterns were estimated.
- Characterizing smoke fully requires describing multiple properties including location, time, particle size, composition, shape, and mixtures - a challenge given sparse measurement data.
Using SAR Intensity and Coherence to Detect A Moorland Wildfire Scar - Gail Millin-Chalabi
This document presents a study that used SAR intensity and coherence to detect a fire scar in a degraded moorland environment in the UK. It describes the methodology, which involved preprocessing SAR data and extracting backscatter values for different land cover classes within the fire scar over time. The results show that precipitation and land cover affected the SAR intensity signal inside the fire scar, with peat bog having the highest returns. InSAR coherence was also analyzed for pairs before and after the fire. The summary concludes that SAR intensity can detect large fire scars but coherence needs more exploration, and recommends investigating different fire scenarios and radar frequencies.
The document describes the rules of a word-guessing game called "Forca" (Hangman), in which children try to guess a word chosen by the teacher or a student by proposing letters, with parts of a gallows or stick figure drawn for each wrong guess; the goal is to guess the word before the drawing is complete.
The Pr4D4 optical effect allows light to be manipulated dynamically and precisely. It uses liquid crystals to alter the shape of incoming light and create holograms in real time. This could be used to project 3D images without the need for glasses or to enhance augmented reality.
This document summarizes the issues with the current employment-based green card process and calls on high-skilled immigrants to attend a rally in Washington D.C. to advocate for legislative changes. It notes that the backlogs and per-country caps mean wait times of 15-16 years for Indian and Chinese applicants. All employment-based applicants, regardless of nationality, will likely face severe retrogression of priority dates due to limited visa numbers. The current system forces immigrants to remain in the same job and salary for 5-7 years while they wait, negatively impacting their careers and economic potential. Legislative action is needed to address issues like per-country caps and increasing visa numbers in order to establish a fairer immigration system.
Enriching Scholarship keynote, 2007, University of Michigan - Bryan Alexander
The document discusses emerging technologies and their potential applications in teaching and learning. It covers topics like Web 2.0, mobile technologies, gaming, and storytelling through new media. Specific applications mentioned include using wikis and blogs for collaborative writing, podcasting and social media for distributing course content, and games/virtual worlds for pedagogical purposes. Concerns about privacy and fears of new technologies are also addressed.
This document presents the 2007 Sustained Silent Reading Project (Proyecto de Lectura Silenciosa Sostenida). The project's objectives are to promote the reading habit in students through daily practice, improve their attitudes toward reading, and encourage exchange. It will be implemented through reading cards prepared by grade level, with weekly recognitions, and by involving parents and guardians to support the project's success.
The invitation invites Bolivia to a videoconference on September 7 at 18:00 Bolivia time on the prevention and detection of cervical cancer. The conference will cover the anatomy and physiology of the affected organ, risk factors, clinical presentation, diagnosis, treatment, and health promotion. It will also explain how to perform exfoliative cytology and who is qualified to do so.
2004-09-21 Natural Aerosol Event Detection and Characterization - Rudolf Husar
The document discusses analyzing natural aerosol events such as dust and smoke over the United States. It outlines using multiple data sources and analytical approaches to identify the spatial, temporal, and chemical patterns of natural aerosols. Specific analysis includes identifying the local and intercontinental origins of dust events over the US using monitoring data, satellite images, and air quality models.
2005-04-01 Carbonaceous Aerosol and Smoke over the Eastern US - Rudolf Husar
1. The document analyzes carbonaceous aerosol and smoke patterns over eastern North America using data from surface and satellite observations.
2. Smoke from biomass burning constitutes a significant component of fine particulate matter over North America, particularly in summer, but its spatial and temporal patterns are not fully understood.
3. The analysis aims to better characterize the sources, distribution, and composition of smoke through an integrated assessment of literature and recent observation data.
2004-09-23 PM Event Detection from Time Series - Rudolf Husar
The document discusses methods for identifying particulate matter (PM) events from time series monitoring data. It describes decomposing the temporal signal into seasonal, random noise, and event components. PM events are identified as spikes above a threshold, such as a percentile of the seasonal values. The analysis shows regional differences in the contribution of these components to total concentration variability and in the composition of identified PM events.
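The decomposition into seasonal, noise, and event components can be sketched numerically. The sketch below runs on synthetic data; the moving-median window, the 99th-percentile threshold, and the injected spikes are assumptions for the sketch, not the document's actual parameters.

```python
# Decomposing a PM time series into seasonal, noise, and event components
# (illustrative sketch on synthetic data).
import numpy as np

rng = np.random.default_rng(1)
days = np.arange(365)

# Synthetic daily PM2.5: seasonal cycle + noise + a few injected events
seasonal_true = 12 + 5 * np.sin(2 * np.pi * days / 365)
pm = seasonal_true + rng.normal(0, 2, 365)
pm[[100, 101, 200]] += [25, 18, 30]          # event spikes (e.g. smoke days)

# Seasonal component: centered 31-day moving median (robust to spikes)
half = 15
seasonal = np.array([np.median(pm[max(0, d - half):d + half + 1])
                     for d in days])
residual = pm - seasonal

# Flag events: residuals above a high percentile of the residual distribution
threshold = np.percentile(residual, 99)
event_days = days[residual > threshold]
print("threshold:", round(float(threshold), 1), "event days:", event_days)
```

A moving median rather than a moving mean keeps the spikes themselves from inflating the seasonal estimate, so the event component stands out cleanly in the residual.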
2003-12-04 Evaluation of the ASOS Light Scattering Network - Rudolf Husar
The document reports on an evaluation of the Automated Surface Observing System (ASOS) light scattering network. It analyzes data from 220 ASOS stations to evaluate the precision and performance of the ASOS visibility sensors. It finds that some stations show excellent correlation between duplicate sensors while others show poorer correlation or significant offsets. It also examines diurnal patterns and the effects of relative humidity on visibility readings.
This document summarizes research on using satellite data to analyze aerosols and smoke plumes. It discusses advances in fire detection from satellites and surface smoke detection. However, smoke quantification remains challenging due to issues like cloud interference and variable smoke reflectance. The document proposes developing a collaborative approach using information technology to integrate different data sources like satellites, models, and surface observations to better detect and quantify smoke emissions and predict their impact on air quality and visibility.
The document discusses atmospheric aerosols and their characterization. It notes that aerosols have complex physicochemical properties and their full characterization requires measuring multiple dimensions such as size, composition, shape, and mixing state. Satellite data has helped characterize the global distribution of aerosols but challenges remain in integrating different data sources. Aerosols originate from both natural sources like dust, fires and volcanoes as well as human activities and influence factors like climate, air quality and human health.
2006-03-08 Intercontinental Aerosol Transport: Quantitative Tools and Results - Rudolf Husar
The document discusses various tools and methods for quantifying intercontinental transport of air pollutants like dust and determining their climatic effects. It summarizes that satellite data shows Sahara desert as the largest global dust source. Transport models and analysis of chemical and temporal patterns indicate that fine dust events over the US originate from Saharan dust in summer and Gobi dust in spring. Methods like chemical fingerprinting, back trajectory analysis, and integrated chemistry-transport modeling are used to attribute the origin and quantify transport of different dust sources.
This document summarizes Rudolf Husar's presentation on exceptional event analysis and decision support systems. It discusses using diverse data such as satellites, models, and real-time monitoring to evaluate exceptional events like wildfires and dust storms and their impact on air quality measurements. Specific examples are presented of Asian and African dust impacting North America, as well as wildfires in Georgia impacting ozone and PM2.5 levels. Tools like the Navy Aerosol Analysis and Prediction System (NAAPS) model and satellite data are highlighted for their ability to analyze the transport and impact of these aerosol plumes to support regulatory decisions. The goal of reconciling emissions, observations, and models is discussed as a way to improve the evaluation of exceptional events.
1) Boundary conditions representing air flowing into North America from other regions contribute significantly to uncertainties in atmospheric CO2 mixing ratios, especially at seasonal timescales.
2) Fossil fuel emissions uncertainties are another major source of uncertainty in CO2 mixing ratios when analyzed at annual timescales.
3) Flux tower measurements of ecosystem carbon uptake and atmospheric CO2 concentration data provide consistent results on biases in biogeochemical model simulations, but concentrations cannot fully disentangle diurnal biases identified by flux towers.
Retrieval & monitoring of atmospheric greenhouse gases (GHGs) through remot... - debasishagri
- Heavier precipitation and longer/more intense droughts are effects of climate change.
- Satellite measurements of greenhouse gases like CO2, CH4, N2O can provide global coverage but require complex retrieval algorithms to account for atmospheric conditions.
- Newer retrieval techniques like using merged fitting windows have improved accuracy for scenes with thin cirrus clouds compared to earlier techniques.
This document discusses regional and country-scale carbon budgets. It begins by outlining some issues with mesoscale models and uncertainties in regional budgets due to limited observations. It then discusses efforts to close carbon budgets at continental scales and compares bottom-up and top-down estimates, finding large discrepancies. The document presents several case studies of carbon budget modeling and inversion for the Netherlands and Germany, comparing modeled fluxes to aircraft observations. It concludes by discussing reducing uncertainties in carbon budgets by bridging scales from global to regional to local through integrated surface, aircraft and potential satellite observations.
Rudolf B. Husar presented at the EPA on exceptional smoke and dust events. He discussed using diverse data like satellites, models, and real-time data in a decision support system to evaluate these events. The NAAPS aerosol model assimilates satellite data to provide the 3D structure of smoke, dust, and other aerosols. Long-term NAAPS data from 2006 to present show the vertical distribution of different aerosols. Satellite data help reduce biases between surface PM measurements and air quality models.
2005-01-28 Assessment of the Speciated PM Network (Initial Draft, November 2004) - Rudolf Husar
This document summarizes the assessment of the speciated particulate matter (PM) monitoring network in the United States. It finds that since 2000, speciated PM monitoring has expanded from 50 sites to over 350 sites. By 2003, the spatial coverage of speciated monitoring was high across the US. For long-term sulfate averages, estimation errors were below 1 microgram per cubic meter in the East. The 350 monitoring sites provide at least 10 times more spatial characterization of PM than the daily sampling frequency alone. The document recommends establishing continuous and automated network assessment as a routine part of ongoing PM monitoring.
2004-11-24 Assessment of the Speciated PM Network (Initial Draft, November 20... - Rudolf Husar
This document summarizes the assessment of the speciated particulate matter (PM) monitoring network in the United States. It finds that since 2000, speciated PM monitoring has expanded from 50 sites to over 350 sites. By 2003, the spatial coverage of speciated monitoring was high across the US. For long-term sulfate averages, estimation errors were below 1 microgram/cubic meter in the East. The 350 monitoring sites provide at least 10 times more spatial characterization of PM than the daily sampling frequency alone. The assessment concludes it may be useful to establish continuous and automated network assessment as a routine part of ongoing PM monitoring.
The document provides background information on the COPE field campaign and the goals and instrumentation of the COPE-MED research project. The COPE campaign studied convective storms in southwest England through aircraft and ground-based radar observations. COPE-MED aims to investigate microphysical pathways and entrainment effects on precipitation formation. Key goals are to analyze cloud liquid water content and droplet number concentration measurements from the campaign and examine droplet spectral characteristics during non-precipitating cloud penetrations.
The document discusses three analysis frameworks for integrating energy and environmental issues:
1) The sensory-motor feedback loop models the assessment-control cycle using monitoring data to inform goals and actions.
2) The biogeochemical cycling loop illustrates the circulation of materials like carbon and nitrogen between environmental compartments.
3) The causality loop links human activities, their impacts on the environment, and feedback through economic and social factors.
This document provides information about the Copernicus Climate Change Service (C3S) and Copernicus Atmosphere Monitoring Service (CAMS). It describes how C3S and CAMS use satellite observations, models, and reanalysis to provide data and services related to climate change and atmospheric composition. It also discusses the development of a potential anthropogenic CO2 emissions monitoring system building upon CAMS and C3S modeling capabilities with complementary satellite and in-situ observation components.
The document provides an overview of the Copernicus program, the EU's Earth observation program. It describes the six Copernicus services that deliver products using Earth observation data and models. It highlights some key Sentinel satellite missions, including the recent successful launch of Sentinel-1B and that Sentinel-3A is now in its operational qualification phase. It also briefly describes the services' products and applications in areas like land monitoring, marine environment monitoring, atmosphere monitoring, climate change, and emergency management.
This document discusses the challenges of characterizing air pollution using remote sensing observations over China. It describes the seven dimensions of data - spatial, height, time, particle size, composition, shape, and mixing - needed to fully characterize air pollution. While each individual observation method or data set has limitations, together they can provide consistent global-scale observations. There remain significant challenges to integrating data from multiple sensors to accurately measure air pollution. International collaboration combining global satellite data with detailed local observations in China may help advance progress in addressing this issue.
The document discusses the Air Quality Community of Practice (AQ CoP) which facilitates interoperability and data networking for air quality and health applications. The AQ CoP has developed an open-source Air Quality Data Network (ADN) consisting of 7 interoperable air quality data servers that provide access to diverse observational and model datasets using international standards. The ADN demonstrates GEO principles and infrastructure but requires further development to support real applications. The main role of the AQ CoP is to connect different initiatives and enable the ADN network.
The workshop will bring together practitioners from Europe and North America to discuss progress and challenges in realizing an interoperable air quality data network. Participants will assess the current state of the pilot network, address key technical issues around data standards, server implementation and maintenance, and catalog design. The goal is to advance the network from a virtual concept to an operational reality, facilitating improved access, integration and reuse of air quality observation and model data.
The document describes DataFed, a federated data system that provides non-intrusive integration of diverse environmental datasets using open standards. DataFed allows users to find and access datasets through a catalog and flexible tools for processing and visualizing the data. It facilitates publishing, finding, and accessing geospatial and environmental data through loose coupling of autonomous nodes and OGC web service protocols.
This document discusses the emerging pattern in the air quality information ecosystem. It notes that individual data providers, scientists, and decision supporters are being replaced by groups that facilitate access, sharing, and integration. These include data portals, science teams, and decision support systems. The ecosystem involves multiple stages from observations to decisions, with value added at each stage through activities like data aggregation, scientific collaboration, and predictive analysis. This new structure is more efficient and supports the goals of initiatives like GEOSS.
The document discusses a workshop on networking air quality observations and models to support decision making. The workshop aims to (1) introduce participants and identify shared data and applications, (2) exchange best practices for interoperability, and (3) address technical and collaboration issues. The preliminary agenda covers assessing the current state of air quality interoperability and the technical requirements for improved data sharing and integration to support applications and decision support systems.
The document summarizes the exploration of PM networks and data over the US using two datasets: AQS and VIEWS. It presents information on the coverage and frequency of EPA monitoring data, as well as data from the VIEWS network. It also describes the user interface for the Datafed browser and schemes for processing and aggregating raw monitoring data spatially and temporally. Finally, it analyzes the spatial and temporal variation of PM levels and the correlation between continuous and EPA monitoring data in different regions of the US.
110410 aq user_req_methodology_sydney_subm - Rudolf Husar
This document proposes a methodology to determine user requirements for Earth observations related to air quality management. The methodology is a bottom-up approach that (1) defines the major workflow steps of air quality management, (2) identifies the value-adding activities within each step, (3) determines the participants ("users") for each activity, and (4) establishes the Earth observation needs of each user. The methodology is intended to facilitate ongoing feedback to optimize the value of Earth observations for air quality management and reduce gaps. It provides a systematic way to account for user needs based on the specific activities and users involved in the air quality management process.
This document provides a 2011 progress report for the GEOSS Air Quality Community of Practice (AQ CoP). It summarizes activities undertaken in 2011, including developing an air quality data server software to make data more accessible and interoperable, creating a user requirements registry to identify needed observations and models, and matching user needs with available data through a community catalog. It outlines ongoing projects and plans to further expand the air quality data network through coordination and workshops in 2011. The overall goal is to integrate air quality initiatives and make relevant data more findable, accessible and interoperable to support applications in air quality and health.
The document describes the HTAP Data Network, which demonstrates a service-oriented approach to sharing atmospheric model outputs and air quality observations between various data servers using open standards. The main output is open-source WCS data server software and tools that allow different organizations to publish, find, and access distributed air quality data holdings in an interoperable way as part of the GEO Task DA-09-02d: Atmospheric Model Evaluation Network. The network aims to connect air quality data providers and users to enable effective air quality science and management.
The REASoN Project will link NASA's air quality data, modeling, and systems to users in research, education, and applications. It aims to address hurdles users face in finding, accessing, evaluating, and merging relevant data. The project will utilize service orientation and interoperability standards to build an adaptable information infrastructure. This will include becoming a node on the air quality network, implementing standards for sharing data and tools, and participating in the GEOSS Architecture Implementation Pilot.
This document summarizes the Exceptional Event Decision Support System (EE DSS), which uses NASA satellite data and the Navy Aerosol Analysis and Prediction System (NAAPS) model to help with air quality management decisions regarding exceptional events like smoke and dust events. The EE DSS has been developed since 2005 with NASA support and is now ready to serve air quality management at the federal, regional, and state levels. It can automatically detect and analyze events, display relevant data through interactive maps and cross-sections, and its tools have helped explain declines in exceptional event flags and PM2.5 concentrations from 2006 to 2012. Coordination is proposed with NASA and EPA for continued application of the EE DSS to smoke and dust events.
This document discusses the usefulness of satellite observations for air quality applications and regulatory requirements. It outlines six key air quality requirements that satellites can help address, such as determining compliance with air quality standards and identifying long-range pollution transport events. The document also notes how satellites can help improve emissions estimates, characterize long-range transport of pollution, and increase interaction between air quality and remote sensing scientists. However, it cautions that relating satellite aerosol optical depth measurements directly to ground-level PM concentrations currently has too much uncertainty for regulatory or public health applications.
The document discusses tools for closing the gap between emissions, observations, and models of air quality. It proposes a service-oriented architecture and network to integrate multiple datasets from observations, emissions, and models. This would allow iterative evaluation and improvement of models by comparing them to observations and adjusting emissions estimates to reduce biases. The end goal is to provide the best available estimate of atmospheric composition by integrating the best observations, emissions estimates, and models.
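The iterative model-observation loop described above can be illustrated with a toy sketch. This is not the document's actual method: the stand-in "model" here simply scales emissions by a fixed transfer factor, and all names and numbers are invented for illustration.

```python
# Toy sketch of the iterative loop: compare model output against
# observations and rescale the emissions estimate to reduce bias.

def run_model(emissions, transfer=0.8):
    """Stand-in for a chemical transport model: predicted
    concentrations are assumed proportional to emissions."""
    return [transfer * e for e in emissions]

def adjust_emissions(emissions, observations, iterations=10):
    """Multiplicatively nudge emissions until the mean model bias
    against observations is small."""
    for _ in range(iterations):
        modeled = run_model(emissions)
        # Ratio of observed to modeled totals drives the adjustment.
        bias = sum(observations) / sum(modeled)
        emissions = [e * bias for e in emissions]
    return emissions

obs = [10.0, 12.0, 8.0]          # hypothetical observed concentrations
first_guess = [5.0, 5.0, 5.0]    # hypothetical a-priori emissions
tuned = adjust_emissions(first_guess, obs)
```

Because the stand-in model is linear, the loop converges after one iteration; a real transport model would make each evaluation step far more expensive, but the compare-and-adjust structure is the same.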
This proposal outlines a study on the influence of weather and climate events on air quality issues like dust, smoke, and sulfate events. The study would examine these events at both the continental/hemispherical scale and regional scale. At the continental scale, the analysis would demonstrate the role of global climate and emissions and identify tipping points for air quality regulations. At the regional scale, the study would analyze the effects of regional emissions, climate, and precipitation on air quality. The proposal describes tools and methods for conducting continental and regional air quality-climate analysis, including models, datasets, and satellite data. The goals are to support air quality management and identify implications for policy.
The document discusses various applications of air quality data including regulatory exceptions, hemispheric transport projects, and atmospheric composition portals. It also describes the Air Quality Community of Practice's contributions to the GEOSS Common Infrastructure through developing an air quality community catalog and data finder to help users discover and access air quality data and metadata registered in the GEOSS clearinghouse and registry.
The document discusses several air quality applications and projects including regulatory exception events, hemispheric transport modeling, and atmospheric composition portals. It also describes the Air Quality Community of Practice's contributions to the GEOSS Common Infrastructure through developing an air quality catalog and data finder to help users discover and access air quality data and metadata registered in the GEOSS Clearinghouse and Registry.
2004-06-20 Fast Aerosol Sensing Tools for Natural Event Tracking (FASTNET), Rudolf Husar
This document summarizes the FASTNET project which aims to better characterize natural haze conditions through the development of tools to access, archive, and analyze aerosol data. The project focuses on detailed analysis of major natural aerosol events like dust storms and wildfires. Initial efforts demonstrate the feasibility of using data on aerosol composition, transport patterns, and satellite imagery to identify the sources and pathways of dust transported from the Sahara desert to the eastern United States.
2004-06-23 Retrieval of smoke aerosol loading from remote sensing data, Rudolf Husar
This document summarizes a method for quantifying biomass burning aerosol loading using remote sensing data. It describes retrieving aerosol optical thickness from satellite imagery by subtracting surface reflectance from total reflectance. Daily aerosol maps are generated and cleaned to filter out clouds and other interferences. Continuing work includes estimating smoke fluxes and fusing multiple data sources to improve quantification of biomass burning for climate modeling.
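The retrieval step described above can be sketched per pixel: aerosol signal as total (top-of-atmosphere) reflectance minus surface reflectance, with a crude brightness threshold standing in for the cloud filtering. The threshold and sample values are illustrative, not from the document.

```python
# Minimal per-pixel sketch: excess reflectance attributed to aerosol,
# with bright pixels masked out as clouds.

CLOUD_REFLECTANCE = 0.35  # hypothetical cut-off: brighter pixels treated as cloud

def aerosol_excess(total_reflectance, surface_reflectance):
    """Excess reflectance attributed to aerosol; None where the pixel
    is masked out as cloud."""
    if total_reflectance >= CLOUD_REFLECTANCE:
        return None  # cloud-contaminated pixel, dropped from the daily map
    return max(0.0, total_reflectance - surface_reflectance)

# A hypothetical row of a daily aerosol map, surface reflectance 0.08:
daily_map = [aerosol_excess(t, 0.08) for t in (0.10, 0.22, 0.40, 0.08)]
```

A real retrieval converts this excess reflectance to optical thickness through a radiative transfer model; the subtraction and masking above are only the first, simplified step.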
2004-06-24 Co-retrieval of Aerosol and Surface Reflectance: Analysis of Daily..., Rudolf Husar
The document summarizes a method for co-retrieving aerosol and surface reflectance from daily SeaWiFS satellite data from 2000-2002. It describes how aerosols scatter and absorb incoming radiation, obscuring the surface reflectance detected by the sensor. The method uses a time series analysis to identify clear "anchor" days with minimal aerosol scattering to retrieve the surface reflectance. It then uses a radiative transfer model along with the surface reflectance values to iteratively retrieve the aerosol optical thickness and refine the surface reflectance estimates. Results show seasonal changes in surface reflectance over eastern and western US regions.
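The "anchor day" idea above can be sketched for a single pixel: the clearest days in a reflectance time series have the least aerosol scattering, so a low tail of the series approximates the surface reflectance, and the excess on other days is attributed to aerosol. The real method iterates with a radiative transfer model; the linear excess used here is a deliberate simplification, and the sample series is invented.

```python
# Hedged sketch of anchor-day surface retrieval for one pixel.

def surface_from_anchor_days(toa_series, k=3):
    """Estimate surface reflectance as the mean of the k lowest
    (clearest, "anchor day") reflectance values in the series."""
    return sum(sorted(toa_series)[:k]) / k

def aerosol_signal(toa_series, k=3):
    """Excess reflectance above the anchor-day surface estimate,
    attributed to aerosol scattering."""
    surface = surface_from_anchor_days(toa_series, k)
    return [max(0.0, r - surface) for r in toa_series]

# Hypothetical top-of-atmosphere reflectances for one pixel over a week:
toa = [0.10, 0.11, 0.10, 0.25, 0.18, 0.11, 0.30]
excess = aerosol_signal(toa)
```

Hazy days (0.25, 0.30) stand out as large excess values, while the anchor days themselves return to near zero, which is the separation the co-retrieval exploits.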
0507 Event Analysis 051101 Event Seminar2
1. Characterization of Aerosol Events Using the Federated Data System, DataFed. R.B. Husar and S.R. Falke, Washington University in St. Louis. Presented at the EPA OAQPS Seminar, Research Triangle Park, NC, November 1, 2005.
3. NAAMS: National Ambient Air Monitoring Strategy and NCore … coordinated multi-pollutant real-time monitoring network
5. Dust, Smoke and Exceptional Events. Panel labels: Intercontinental Dust; Smoke Event; July 4, 2004; July 4, 2003.
8. AIRNOW PM2.5 and ASOS RH-Corrected Bext, July 21-23, 2004. Paired panels for each day: AIRNOW PM2.5; ASOS RHBext.
9. Quebec Smoke, July 7, 2002: Satellite Optical Depth & Surface ASOS RHBext
12. Bext Distribution Function (1979). Albany: sigma g = 3.75; Charlotte: sigma g = 1.56. Upper 20th percentile contribution: Northeast > 50% of dosage; Southeast < 30% of dosage.
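The upper-percentile dosage contributions on this slide follow from lognormal statistics: for a lognormal Bext distribution with geometric standard deviation sigma_g, the fraction of total dosage contributed above the q-th percentile is Phi(sigma - z_q), with sigma = ln(sigma_g). A stdlib-only sketch, assuming an idealized lognormal (the slide's exact percentages come from the measured distributions, so agreement is only approximate):

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def norm_ppf(p):
    """Inverse standard normal CDF by bisection (stdlib-only)."""
    lo, hi = -10.0, 10.0
    while hi - lo > 1e-12:
        mid = 0.5 * (lo + hi)
        if norm_cdf(mid) < p:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def top_quantile_dose_fraction(sigma_g, q=0.80):
    """Fraction of total dosage contributed by values above the q-th
    percentile of a lognormal with geometric std sigma_g, from the
    partial-expectation identity E[X; X > k] / E[X] = Phi(sigma - z_q)."""
    sigma = math.log(sigma_g)
    return norm_cdf(sigma - norm_ppf(q))
```

With sigma_g = 3.75 (Albany) the top quintile carries roughly two-thirds of the dosage, while with sigma_g = 1.56 (Charlotte) it carries roughly a third, reproducing the episodic-Northeast versus steadier-Southeast contrast.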
20. Average and 98th Percentile Spatial Patterns. Columns: SO4, PM2.5 Mass, OC; rows: Average, 98th Percentile.
26. OCSB, OCSoot, and PM2.5 Seasonal Pattern, averaged over the 2000-2004 period. Series vs. day of year: PM2.5 Mass; OCSB (SmokeBio); OCSoot. Annotations: Mexican Smoke, Agricultural Smoke, Urban Soot.
28. Soot Spatial Pattern by season: Dec-Jan-Feb, Mar-Apr-May, Jun-Jul-Aug, Sep-Oct-Nov.
29. PM2.5 (blue) and SmokeBioMass (red). Note: smoke events are spikes superimposed on the biogenic OC background. Annotations: Smoke Events; Kansas Agricultural Smoke.
30. Example OC Smoke Events. Note: smoke events are spikes superimposed on the biogenic OC background. Annotation: Smoke Events.
31. GRSM Seasonal Pattern of Percentiles: PM2.5, OC, SO4, Soil (episodic annotations). OC in fall dominates the episodicity: smoke organics?
34. FASTNET: Inter-RPO pilot project through NESCAUM, 2004. Web-based data and tools for community use. Built on the DataFed infrastructure (NSF, NASA). Project fate depends on sponsor and user evaluation.
36. Conceptual Diagram of an Emissions Community. Diagram elements: Users & Projects; Web Tools/Services; Mediators/Portal; Data Catalogs (Emissions Inventory Catalog, Geospatial One-Stop); Wrappers; Data (Emissions Inventories, Activity Data, Emissions Factors, Surrogates, Spatial Allocation); Estimation Methods; Transport Models; Data Analysis; Model Development; Comparison of Emissions Methods; Report Generation; XML; GIS; RDBMS.
37. North American Commission for Environmental Cooperation: 2002 North American Powerplant Emissions. Report: http://www.cec.org/files/PDF/POLLUTANTS/PowerPlant_AirEmission_en.pdf Web application: http://webapps.datafed.net/dvoy_services/datafed.aspx?page=PowerPlant_Emissions
38. Spatial-temporal analysis of fire counts. http://webapps.datafed.net/dvoy_services/datafed.aspx?page=Fire_Pixel_Count_AK Large fires during the summer of 2004 in central Alaska. Spatially aggregated count of fire pixels over a 100 km² area. The size of each red square on the map is proportional to the number of fire pixels. The spatial aggregation allows the generation of a time series for each aggregated area.
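The aggregation step this slide describes can be sketched directly: bin fire pixel detections (lat, lon, date) into grid cells and accumulate a per-cell daily count, which yields one time series per aggregated area. The one-degree cell size is a rough stand-in for the 100 km grid, and the sample detections are invented.

```python
# Minimal sketch of spatial aggregation of fire pixels into grid-cell
# time series, as described for the Alaska 2004 fires.
from collections import defaultdict

CELL_DEG = 1.0  # ~100 km in latitude; illustrative cell size

def cell_of(lat, lon):
    """Snap a coordinate to its grid-cell corner."""
    return (int(lat // CELL_DEG), int(lon // CELL_DEG))

def aggregate(fire_pixels):
    """Count fire pixels per (cell, date); returns {cell: {date: count}},
    i.e. one time series per aggregated area."""
    series = defaultdict(lambda: defaultdict(int))
    for lat, lon, date in fire_pixels:
        series[cell_of(lat, lon)][date] += 1
    return series

pixels = [
    (64.8, -147.7, "2004-06-16"),   # hypothetical central-Alaska detections
    (64.9, -147.5, "2004-06-16"),
    (64.85, -147.6, "2004-06-17"),
    (61.2, -149.9, "2004-06-16"),
]
ts = aggregate(pixels)
```

Each cell's inner dict is the time series that the map's proportionally sized squares summarize for one day.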
39. BLM area burned, monthly average. The acres burned in the BLM compiled fire-history dataset are spatially aggregated on a 50 km² grid and temporally aggregated to a monthly resolution. Circles are proportional to the acres burned at a location for a particular year and month. The time-series plot shows the monthly total acres burned in a particular 50 km² area. http://webapps.datafed.net/dvoy_services/datafed.aspx?page=BLM_AcresBurned
40. Aggregation Tools. Panels: Fire Pixels, June 16-23, 2004; Spatially Aggregated; Monthly Sum, Spatially Aggregated.
41. Spatial-temporal comparison of fire pixels. http://www.datafed.net/WebApps/MiscApps/ModisGoes/FireLocationComparison.htm A red-shaded square indicates a short distance separating the MODIS and GOES pixels, while a blue-shaded square indicates that the nearest neighbor between the datasets was far apart. A red-outlined square indicates the nearest neighbor was detected on the same day, while a blue-outlined square indicates a longer time separation. Gray shaded and/or outlined squares indicate that no nearest neighbor was found between the two datasets given the search parameters (in this example, 100 km and 2 days).
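The nearest-neighbor comparison this slide describes can be sketched as follows: for each MODIS fire pixel, find the closest GOES fire pixel within the spatial and temporal search limits (100 km and 2 days in the example) and report the separations that drive the shading and outlining. The sample pixels are invented.

```python
# Sketch of MODIS-to-GOES nearest-neighbor matching of fire pixels.
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two lat/lon points."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearest_match(modis_pixel, goes_pixels, max_km=100.0, max_days=2):
    """Return (distance_km, day_gap) of the nearest qualifying GOES
    pixel, or None when no neighbor is found within the search limits."""
    lat, lon, day = modis_pixel
    best = None
    for glat, glon, gday in goes_pixels:
        if abs(gday - day) > max_days:
            continue
        d = haversine_km(lat, lon, glat, glon)
        if d <= max_km and (best is None or d < best[0]):
            best = (d, abs(gday - day))
    return best

goes = [(35.0, -90.0, 168), (40.0, -100.0, 170)]   # (lat, lon, day-of-year)
match = nearest_match((35.1, -90.1, 168), goes)
```

A small distance and zero day gap would shade and outline the square red; a None result corresponds to the gray squares on the comparison map.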
42. Standards-Based Data Sharing. Open Geospatial Consortium (OGC) specifications for web mapping: Web Map Service (images); Web Feature Service (point/vector data); Web Coverage Service (gridded data). Related initiatives: Geospatial One-Stop, The National Map. DataFed-OGC description: http://www.datafed.net/DataLinks/OGC/OGC.htm DataFed OGC WMS for fire data: http://webapps.datafed.net/dvoy_services/ogc_domain_fire.wsfl?SERVICE=WMS&VERSION=1.1.1&REQUEST=GetCapabilities
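WMS requests like the GetCapabilities URL on this slide are just key-value query strings, so a client can compose them with the standard library alone. A small sketch, using the DataFed fire-data endpoint from the slide; the helper name is illustrative:

```python
# Compose OGC WMS request URLs for the DataFed fire-data service.
from urllib.parse import urlencode

ENDPOINT = "http://webapps.datafed.net/dvoy_services/ogc_domain_fire.wsfl"

def wms_url(request, version="1.1.1", **extra):
    """Build a WMS query URL for the given REQUEST type; extra
    keyword arguments become additional query parameters."""
    params = {"SERVICE": "WMS", "VERSION": version, "REQUEST": request}
    params.update(extra)
    return ENDPOINT + "?" + urlencode(params)

capabilities = wms_url("GetCapabilities")
```

A GetMap request would add the parameters the WMS specification requires (LAYERS, BBOX, SRS, WIDTH, HEIGHT, FORMAT) through the `extra` keywords; the GetCapabilities response lists the layer names the service actually offers.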