Use of Wind Tunnel Refinements in the Dispersion Modeling Analysis of the Ala...Sergio A. Guerra
The proposed Alaska LNG GTP project includes the construction of a natural gas treatment plant on the Alaska North Slope. The Gas Treatment Plant (GTP) is proposed to be located on the west coast of Prudhoe Bay and would treat natural gas produced on the North Slope.
Initial dispersion modeling of the GTP found results inconsistent with local and regional measurements when evaluating compliance with the 1-hour NO2 National Ambient Air Quality Standard (NAAQS), due in part to two nearby sources: the Central Gas Facility (CGF) and the Central Compression Plant (CCP), located immediately east of the GTP. The prevailing winds at the site are east-northeast and west-southwest, which align with the arrangement of the facilities.
The building downwash inputs generated by the Building Profile Input Program for PRIME (BPIPPRM) were evaluated for the CGF and CCP facilities. This analysis confirmed that the building dimension inputs for numerous wind directions were outside the tested range of the theory used to develop the building downwash algorithms in AERMOD. Previous studies [2, 8, 11, 12, 13] suggest that AERMOD predictions are biased to overstate downwash effects for certain building input ratios.
Wind tunnel studies were conducted to determine equivalent building dimensions (EBD) for the most critical stacks and wind directions, refining the concentrations predicted by AERMOD. The current paper covers the EBD method used to refine the building inputs for the CGF and CCP facilities. The regulatory process and the benefits of this physical modeling method are also discussed.
C6.05: New ocean-colour products for the user community - Shubha Sathyendrana...Blue Planet Symposium
The ocean-colour component of the Climate Change Initiative of the European Space Agency has generated a time series of bio-optical products from late 1997 to mid 2012. The products are based on data from the SeaWiFS, MODIS-A and MERIS sensors, band-shifted (to bring data to a common set of wavebands), corrected for inter-sensor bias, and then merged. Products include remote-sensing reflectances at SeaWiFS wavelengths, chlorophyll concentration, diffuse attenuation coefficient at 490 nm, and inherent optical properties (components of absorption and back-scattering coefficients). Practically all the products have uncertainties (root-mean-square difference and bias) associated with them on a pixel-by-pixel basis, based on validation using in situ data. The first version of the products is available freely at www.oceancolour.org and at www.esa-oceancolour-cci.org. A second version is expected to be released prior to the Blue Planet Symposium in Australia in 2015. Furthermore, plans are underway to add to the product suite through a number of related ESA projects. New products envisaged include primary production, photosynthesis parameters, components of the carbon pool in the ocean, and photosynthetically active radiation (PAR) at the sea surface. User consultation and serving the user community are very much a part of these projects, and the Blue Planet provides a useful forum for reaching users from a variety of backgrounds. The work reported here contributes to components C2 (Sustained Ecosystems and Food Security) and C5 (Ocean Climate and Carbon) of the “Oceans and Society: Blue Planet” initiative of the Group on Earth Observations (GEO).
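As a concrete illustration of the two per-pixel uncertainty measures mentioned above, the root-mean-square difference and bias between matched satellite retrievals and in situ measurements reduce to a few lines of code. This is a minimal sketch; the function name and match-up values are illustrative and not taken from the CCI processing chain:

```python
import numpy as np

def rmsd_and_bias(satellite, in_situ):
    """Root-mean-square difference and bias between matched satellite
    retrievals and in situ measurements (the two uncertainty measures
    attached to the products on a per-pixel basis)."""
    d = np.asarray(satellite, dtype=float) - np.asarray(in_situ, dtype=float)
    return float(np.sqrt(np.mean(d ** 2))), float(d.mean())

# Toy match-up values (e.g. log10 chlorophyll); purely illustrative.
sat     = np.array([0.2, 0.5, 0.1])
in_situ = np.array([0.1, 0.4, 0.3])
rmsd, bias = rmsd_and_bias(sat, in_situ)
print(f"RMSD = {rmsd:.3f}, bias = {bias:.3f}")
```

In the actual products these statistics are computed per optical water class and mapped back to each pixel, but the underlying definitions are the two quantities above.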
Avoid Air-rors! Discuss the Air Regulations that Impact Oil and Gas DevelopmentTrihydro Corporation
Presentation about the air regulations affecting oil and gas development. Topics covered include NSPS OOOO, Leak Detection and Repair, Greenhouse Gas Inventory/Reporting, Optical Gas Imaging with Infrared Cameras
NOVEL DATA ANALYSIS TECHNIQUE USED TO EVALUATE NOX AND CO2 CONTINUOUS EMISSIO...Sergio A. Guerra
The current study presents a new data analysis technique developed while evaluating continuous emission data collected from a trash compactor. The evaluation involved tailpipe sampling with a portable emission monitoring system (PEMS) from a diesel-fueled 525-horsepower trash compactor. The sampling campaign was carried out by running the compactor on regular No. 2 diesel, B20, and ULSD fuels. The purpose was to determine the possible emission reductions in nitrogen oxides (NOx) and carbon dioxide (CO2) from the use of B20 and ULSD in an off-road vehicle. The results from the NOx analysis are discussed.
The initial data analysis identified two important issues. The first was a bias in the calculated F values due to the very large number of samples (N): the large N influenced the probability values and indicated a false statistical significance for all factors tested. Additionally, the data observations were found to be highly autocorrelated. A time-interval data reduction technique was therefore used to address these two limitations to the robustness of the statistical analyses. The result in each case was a subset of quasi-independent observations sampled at an interval of 800 seconds. This technique promptly resolved the autocorrelation and false statistical significance issues. Since false statistical significance and autocorrelation are inherent in continuous data, the positive results obtained with this technique can be far-reaching. The technique allowed a valid use of the general linear model (GLM) with engine speed as the covariate to test the day, fuel type, and compactor factors. It is especially relevant given the advancements in data collection capabilities, which require data handling techniques that satisfy the statistical assumptions necessary for valid analyses.
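The time-interval thinning idea can be sketched as follows: keep one observation per fixed interval so that successive retained samples are far enough apart in time to be quasi-independent. This is a minimal sketch with synthetic AR(1) data standing in for 1 Hz PEMS measurements; the 800-second interval is from the study, everything else is illustrative:

```python
import numpy as np

def lag1_autocorr(x):
    """Sample lag-1 autocorrelation of a 1-D series."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    return float(np.dot(x[:-1], x[1:]) / np.dot(x, x))

def thin(series, interval_s, sampling_rate_hz=1.0):
    """Keep one observation every interval_s seconds."""
    step = int(interval_s * sampling_rate_hz)
    return series[::step]

# Synthetic, strongly autocorrelated 1 Hz series (AR(1) with phi = 0.99).
rng = np.random.default_rng(0)
n = 200_000
eps = rng.normal(size=n)
x = np.empty(n)
x[0] = 0.0
for i in range(1, n):
    x[i] = 0.99 * x[i - 1] + eps[i]

full_r = lag1_autocorr(x)              # close to 0.99: heavy autocorrelation
thin_r = lag1_autocorr(thin(x, 800))   # near zero: quasi-independent subset
print(f"full series: {full_r:.3f}, 800 s subset: {thin_r:.3f}")
```

With quasi-independent observations, the F-tests in the GLM no longer see an inflated N, so the probability values stop signalling spurious significance.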
Backscatter Working Group Software Inter-comparison ProjectRequesting and Co...Giuseppe Masetti
Backscatter mosaics of the seafloor are now routinely produced from multibeam sonar data and used in a wide range of marine applications. However, significant differences (up to 5 dB) have been observed between the levels of mosaics produced by different software processing the same dataset. This is a major detriment to several possible uses of backscatter mosaics, including quantitative analysis, monitoring seafloor change over time, and combining mosaics. A recently concluded international Backscatter Working Group (BSWG) identified this issue and recommended that “to check the consistency of the processing results provided by various software suites, initiatives promoting comparative tests on common data sets should be encouraged […]”. However, backscatter data processing is a complex (and often proprietary) sequence of steps, so simply comparing end results between software packages does not provide much information as to the root cause of the differences between results.
In order to pinpoint the source(s) of inconsistency between software packages, it is necessary to understand at which stage(s) of the data processing chain the differences become substantial. We invited willing software developers to discuss this framework and collectively adopt a list of intermediate processing steps. We provided a small dataset consisting of various seafloor types surveyed with the same multibeam sonar system, under constant acquisition settings and sea conditions, and asked the software developers to generate these intermediate processing results, to be eventually compared. If the experiment proves fruitful, we may extend it to more datasets, software packages, and intermediate results. Eventually, software developers may consider making the results from intermediate stages a standard output, as well as adhering to a consistent terminology, as advocated by Schimel et al. (2018). To date, the developers of four software packages (Sonarscope, QPS FMGT, CARIS SIPS, MB Process) have expressed their interest in collaborating on this project.
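The kind of end-result comparison the project moves beyond can be expressed very simply: a pixel-wise difference between two co-registered mosaics, which reveals that a discrepancy exists but says nothing about which processing stage caused it. A minimal sketch (the function name and toy values are illustrative):

```python
import numpy as np

def mosaic_difference_stats(mosaic_a, mosaic_b):
    """Pixel-wise level difference (in dB) between two co-registered
    backscatter mosaics; NaN marks no-data pixels."""
    a = np.asarray(mosaic_a, dtype=float)
    b = np.asarray(mosaic_b, dtype=float)
    valid = ~np.isnan(a) & ~np.isnan(b)
    diff = a[valid] - b[valid]
    return {"mean_dB": float(diff.mean()),
            "max_abs_dB": float(np.abs(diff).max())}

# Toy 3x3 mosaics of the same patch from two processing chains:
# chain B ends up 2 dB lower everywhere (a constant, calibration-like offset).
a = np.array([[-20., -22., -25.],
              [-21., -23., -24.],
              [-26., -20., -22.]])
b = a - 2.0
stats = mosaic_difference_stats(a, b)
print(stats)  # {'mean_dB': 2.0, 'max_abs_dB': 2.0}
```

A uniform offset like this could have been introduced at several different stages, which is precisely why comparing intermediate processing results is needed to localize the cause.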
Hollow earth, contrails & global warming calculations lectureMarcus 2012
http://marcusvannini2012.blogspot.com/
http://www.marcusmoon2022.org/designcontest.htm
Shoot for the moon and if you miss you'll land among the stars...
Space Environment & Its Effects On Space Systems course samplerJim Jenkins
This class on the space environment and its effects on space systems is for technical and management personnel who wish to gain an understanding of the important issues that must be addressed in the development of space instrumentation, subsystems, and systems. The goal is to help students achieve their professional potential by giving them an understanding of the fundamentals of the space environment and its effects. The class is designed for participants who expect to plan, design, build, integrate, test, launch, operate, or manage payloads, subsystems, launch vehicles, spacecraft, or ground systems.
Each participant will receive a copy of the reference textbook: Pisacane, VL. The Space Environment and its Effects on Space Systems. AIAA Education Series, 2008.
On 29 February and 1 March 2016, the Fundación Ramón Areces examined the relationship between Big Data and climate change in a two-day event. Can Big Data help mitigate climate change? How will massive data analysis contribute to preventing and managing natural disasters? These are just some of the questions the speakers attempted to answer. For the climate sciences, Big Data is a very promising tool for addressing the different phenomena associated with climate change.
The Earth System Grid Federation (ESGF) is a large international collaboration that operates a global infrastructure for management and access of Earth System data. Some of the most valuable data collections served by ESGF include the output of global climate models used for the IPCC reports on climate change (CMIP3, CMIP5 and the upcoming CMIP6), regional climate model output (CORDEX), and observational data from several American and European agencies (Obs4MIPs). This talk will present a brief introduction to ESGF, describe the data access and analysis methods currently available or planned for the future, and conclude with some ideas on how this infrastructure could be used as a testbed for executing distributed analytics on a global scale.
Remote sensing – Beyond images
Mexico 14-15 December 2013
The workshop was organized by the CIMMYT Global Conservation Agriculture Program (GCAP) and funded by the Bill & Melinda Gates Foundation (BMGF), the Mexican Secretariat of Agriculture, Livestock, Rural Development, Fisheries and Food (SAGARPA), the International Maize and Wheat Improvement Center (CIMMYT), the CGIAR Research Program on Maize, the Cereal System Initiative for South Asia (CSISA), and the Sustainable Modernization of Traditional Agriculture (MasAgro) program.
Life Cycle Assessment (LCA) of the Green Site projecteAmbiente
Presentation by Federico Balzan, eAmbiente Srl
Final conference of the GREEN SITE project: “Supercritical fluid technologies for river and sea dredge sediment remediation”. LIFE 10 ENV/IT/343.
Venice, 13 December 2013
Development and validation of a rapid urban scale dispersion modelling platform Scott Hamilton
Here we describe the development of a new dispersion model (RapidAir®, Ricardo-AEA Ltd) designed as a decision support platform, and a recent validation exercise in London, UK, carried out by Strathclyde University. Ricardo's RapidAir model comprises several libraries written in the Python programming language with functionality specific to air quality analysis (e.g. handling time-series observation data, array-based processing of road emissions).
Development and validation of a rapid urban scale dispersion modelling platformScott Hamilton
Presented at the 16th Annual CMAS Conference, Chapel Hill, NC, October 23-25, 2017.
Authors:
Scott L. Hamilton*. Ricardo Energy and Environment, UK
Nicola Masey, Iain Beverland. University of Strathclyde, UK
DSD-SEA 2023 Global to local multi-hazard forecasting - YanDeltares
Presentation by Kun Yan (Deltares) at the Seminar Models and decision-making in the wake of climate uncertainties, during the Deltares Software Days South-East Asia 2023. Wednesday, 22 February 2023, Singapore.
JMeter webinar - integration with InfluxDB and GrafanaRTTS
Watch this recorded webinar about real-time monitoring of application performance. See how to integrate Apache JMeter, the open-source leader in performance testing, with InfluxDB, the open-source time-series database, and Grafana, the open-source analytics and visualization application.
In this webinar, we will review the benefits of leveraging InfluxDB and Grafana when executing load tests and demonstrate how these tools are used to visualize performance metrics.
Length: 30 minutes
Session Overview
-------------------------------------------
During this webinar, we will cover the following topics while demonstrating the integrations of JMeter, InfluxDB and Grafana:
- What out-of-the-box solutions are available for real-time monitoring JMeter tests?
- What are the benefits of integrating InfluxDB and Grafana into the load testing stack?
- Which features are provided by Grafana?
- Demonstration of InfluxDB and Grafana using a practice web application
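Under the hood, the JMeter-to-InfluxDB integration comes down to the Backend Listener writing aggregated sample metrics in InfluxDB's line protocol (measurement, tags, fields, timestamp). A minimal sketch of composing one such line in Python; the measurement, tag, and field names here are illustrative rather than JMeter's exact schema:

```python
def to_line_protocol(measurement, tags, fields, timestamp_ns):
    """Render one data point in InfluxDB 1.x line protocol:
    measurement,tag=... field=... timestamp"""
    tag_str = ",".join(f"{k}={v}" for k, v in sorted(tags.items()))
    field_str = ",".join(f"{k}={v}" for k, v in sorted(fields.items()))
    return f"{measurement},{tag_str} {field_str} {timestamp_ns}"

line = to_line_protocol(
    "jmeter",                                      # measurement name (illustrative)
    {"application": "demo", "transaction": "login"},
    {"count": 42, "avg": 118.5},                   # response-time stats for one interval
    1700000000000000000,                           # nanosecond timestamp
)
print(line)
# jmeter,application=demo,transaction=login avg=118.5,count=42 1700000000000000000
```

Grafana then queries these series from InfluxDB to render the dashboards demonstrated in the webinar.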
To view the webinar recording, go to:
https://www.rttsweb.com/jmeter-integration-webinar
Epistemic Interaction - tuning interfaces to provide information for AI supportAlan Dix
Paper presented at SYNERGY workshop at AVI 2024, Genoa, Italy. 3rd June 2024
https://alandix.com/academic/papers/synergy2024-epistemic/
As machine learning integrates deeper into human-computer interactions, the concept of epistemic interaction emerges, aiming to refine these interactions to enhance system adaptability. This approach encourages minor, intentional adjustments in user behaviour to enrich the data available for system learning. This paper introduces epistemic interaction within the context of human-system communication, illustrating how deliberate interaction design can improve system understanding and adaptation. Through concrete examples, we demonstrate the potential of epistemic interaction to significantly advance human-computer interaction by leveraging intuitive human communication strategies to inform system design and functionality, offering a novel pathway for enriching user-system engagements.
DevOps and Testing slides at DASA ConnectKari Kakkonen
Slides by me and Rik Marselis from the DASA Connect conference, 30 May 2024. We discuss what testing is, what agile testing is, and finally what testing in DevOps looks like. We closed with a lovely workshop in which participants explored different ways to think about quality and testing in the different parts of the DevOps infinity loop.
Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do...UiPathCommunity
💥 Speed, accuracy, and scaling – discover the superpowers of GenAI in action with UiPath Document Understanding and Communications Mining™:
- See how to accelerate model training and optimize model performance with active learning
- Learn about the latest enhancements to out-of-the-box document processing – with little to no training required
- Get an exclusive demo of the new family of UiPath LLMs – GenAI models specialized for processing different types of documents and messages
This is a hands-on session specifically designed for automation developers and AI enthusiasts seeking to enhance their knowledge in leveraging the latest intelligent document processing capabilities offered by UiPath.
Speakers:
👨🏫 Andras Palfi, Senior Product Manager, UiPath
👩🏫 Lenka Dulovicova, Product Program Manager, UiPath
Smart TV Buyer Insights Survey 2024 by 91mobiles
91mobiles recently conducted a Smart TV Buyer Insights Survey in which we asked over 3,000 respondents about the TV they own, aspects they look at on a new TV, and their TV buying preferences.
Transcript: Selling digital books in 2024: Insights from industry leaders - T...BookNet Canada
The publishing industry has been selling digital audiobooks and ebooks for over a decade and has found its groove. What’s changed? What has stayed the same? Where do we go from here? Join a group of leading sales peers from across the industry for a conversation about the lessons learned since the popularization of digital books, best practices, digital book supply chain management, and more.
Link to video recording: https://bnctechforum.ca/sessions/selling-digital-books-in-2024-insights-from-industry-leaders/
Presented by BookNet Canada on May 28, 2024, with support from the Department of Canadian Heritage.
UiPath Test Automation using UiPath Test Suite series, part 3DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 3. In this session, we will cover desktop automation along with UI automation.
Topics covered:
UI automation introduction
UI automation sample
Desktop automation flow
Pradeep Chinnala, Senior Consultant Automation Developer @WonderBotz and UiPath MVP
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
UiPath Test Automation using UiPath Test Suite series, part 4DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 4. In this session, we will cover Test Manager overview along with SAP heatmap.
The UiPath Test Manager overview with SAP heatmap webinar offers a concise yet comprehensive exploration of the role of a Test Manager within SAP environments, coupled with the utilization of heatmaps for effective testing strategies.
Participants will gain insights into the responsibilities, challenges, and best practices associated with test management in SAP projects. The webinar also delves into the significance of heatmaps as a visual aid for identifying testing priorities, areas of risk, and resource allocation within SAP landscapes. Through this session, attendees can expect to enhance their understanding of test management principles while learning practical approaches to optimizing testing processes in SAP environments using heatmap visualization techniques.
What will you get from this session?
1. Insights into SAP testing best practices
2. Heatmap utilization for testing
3. Optimization of testing processes
4. Demo
Topics covered:
Execution from the test manager
Orchestrator execution result
Defect reporting
SAP heatmap example with demo
Speaker:
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
"Impact of front-end architecture on development cost", Viktor TurskyiFwdays
I have often heard that architecture is not important for the front-end. I have also often seen developers implement front-end features by simply following the standard rules of a framework, assuming this is enough to launch the project successfully, and then the project fails. How can this be prevented, and which approach should you choose? I have launched dozens of complex projects, and during the talk we will analyze which approaches have worked for me and which have not.
9. Description of GLASS albedo preliminary product
The GLASS (Global LAnd Surface Satellite) project provides high-resolution land surface parameter datasets (sponsored by the Chinese “863” programme). Parameters include albedo and emissivity (8-day, 1 km), LAI (8-day, 1 km), and PAR (3-hour, 5 km).
GLASS preliminary albedo dataset characteristics: AB (Angular Bin) algorithm (Liang et al., 2005; Qu et al., 2011); resolution 1 km, 1 day; sinusoidal projection; HDF-EOS data format.
10. Description of GLASS albedo preliminary product
Deficiencies of the GLASS preliminary albedo product: frequent data gaps caused by cloud coverage and seasonal snow, and sharp fluctuations in the time series caused by data noise and the uncertainty of the AB inversion algorithm.
Objectives of the temporal filtering algorithm: to fill in data gaps and to smooth the albedo time series.
13. Temporal filtering algorithm - basic idea
Based on the temporal correlation of albedo between neighboring days, it is reasonable to assume that albedo values on neighboring days are linearly related. Then, based on Bayesian theory, the true albedo can be predicted from the AB albedo retrievals of neighboring days.
14. Temporal filtering algorithm - basic idea
Multi-day AB albedo products and multi-year global albedo products yield global albedo a-priori statistics, which are used to build the linear model; temporal filtering then produces the GLASS albedo.
15. Temporal filtering algorithm - temporal filtering formula
The temporal filtering algorithm is a weighted average of neighboring days' albedo, with weights derived from the global a-priori statistics.
16. Temporal filtering algorithm - global albedo a-priori statistics
Data sets used: MODIS albedo products (MCD43B3, 2000-2009) and the same inputs as the AB algorithm (MOD09). Statistics include the multi-year average and variance, and the correlation coefficients of albedo between neighboring days. Resolution: 5 km, 8 days.
17. Temporal filtering algorithm - global albedo a-priori statistics
Regression coefficients are calculated from the background field and the albedo a-priori statistics.
21. Temporal filtering algorithm - conclusion
Table 1: Validation results of the temporal filtering algorithm. AAD: average absolute deviation; AAD1: AAD between GLASS albedo and the temporal filtering results; AAD2: AAD between ground-measured albedo and the temporal filtering results; AAD3: AAD between ground-measured albedo and GLASS albedo.
22. Temporal filtering algorithm - conclusion
The TF method takes the temporal correlation of neighboring days' albedo into account; the temporal filtering algorithm is a weighted average of neighboring days' albedo values. The TF method can fill in data gaps and smooth the albedo series, although it sometimes over-smooths the series; further validation is required.
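The weighted-average idea behind the TF method can be sketched as follows: each day's albedo is replaced by a weighted mean of the valid retrievals on neighboring days, which both fills gaps and smooths the series. This is a minimal illustration; the exponential-decay weights merely stand in for the correlation coefficients taken from the global a-priori statistics in the actual algorithm:

```python
import numpy as np

def temporal_filter(albedo, half_width=3, decay=0.5):
    """Fill gaps (NaN) and smooth an albedo time series by a weighted
    average of neighboring days. Weights fall off with temporal distance,
    standing in for the a-priori day-to-day correlation coefficients."""
    n = len(albedo)
    out = np.empty(n)
    for t in range(n):
        lo, hi = max(0, t - half_width), min(n, t + half_width + 1)
        vals = albedo[lo:hi]
        w = np.exp(-decay * np.abs(np.arange(lo, hi) - t))
        ok = ~np.isnan(vals)                    # use only valid retrievals
        out[t] = np.sum(w[ok] * vals[ok]) / np.sum(w[ok])
    return out

# Short toy series with a two-day gap and one noisy spike (0.45).
series = np.array([0.30, 0.31, np.nan, np.nan, 0.45, 0.32, 0.31])
filled = temporal_filter(series)
print(np.round(filled, 3))
```

The gap days receive a value interpolated from their neighbors, and the spike is pulled toward the surrounding values, mirroring the gap-filling and smoothing behavior described above.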