Dengue Vector Population Forecasting Using Multisource Earth Observation Products and Recurrent Neural Networks (Presented at a seminar to students of the Chinese University of Petroleum, 31:06:2021)
The document describes a methodology for forecasting dengue vector populations one week ahead using multi-source earth observation data and recurrent neural networks. The methodology first clusters mosquito population ground-truth data with k-means clustering, then trains an encoder-decoder LSTM neural network on time series of per-cluster mean earth observation features. The model is tested against random forest and k-nearest neighbor models on data from two locations in Brazil. Results show that the LSTM model tracks the highest and lowest observed mosquito population values more closely than the other models.
1. DENGUE VECTOR POPULATION FORECASTING USING MULTI-SOURCE EARTH OBSERVATION PRODUCTS AND RECURRENT NEURAL NETWORKS (RNN).
2. Mudele, O., Frery, A. C., Zanandrez, L. F., Eiras, A. E., & Gamba, P. (2021). Dengue Vector Population Forecasting Using Multisource Earth Observation Products and Recurrent Neural Networks. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 14, 4390-4404.
Reference literature
4. • Haphazard urban expansion and industrialization affect the attainable quality of life.
• Landscape epidemiology explores the relationship between human health and urban environments.
• Spaceborne/Earth observation (EO) data provide perspectives on urban area changes and their effects.
Figure: A tale of two different stories in the same city - Rio de Janeiro, Brazil (Source: Google images)
Urbanisation and Epidemiology
5. • Global multiband Earth observation at wavelengths between 0.4 µm and 15 µm.
• Applications include urban target detection, land cover mapping, and urban ecology analysis (e.g. epidemiological processes).
• Multispectral: sparsely sampled bands (up to about 12); Hyperspectral: densely sampled bands (up to about 220).
• Advantages of multispectral optical EO data include:
• fewer bands than hyperspectral, meaning lower compute overhead;
• the possibility to extract earth surface information using easy-to-compute normalised indices and thermal infrared bands.
Urban remote sensing with optical EO data
6. • Improved EO data input (spatial and temporal) resolutions.
• Improved bespoke modeling and feature extraction methods.
Modeling epidemiological risks with multispectral data
7. • Sample vector: the Ae. aegypti mosquito species.
• Diseases carried: Zika, Dengue, and Chikungunya viruses.
• Presence: over 100 countries.
• Informative environmental effects: vegetation condition, temperature, humidity, precipitation.
• These environmental effects can be extracted from EO data across the world to derive models.
Specific application: Mosquito disease risk modeling
8.
| Mission | Available since | No. of bands | Spatial resolution (m) | Temporal resolution (days) | Thermal infrared? | Free? |
| Landsat | July 1972 | 8 | 30 | 16 | Yes | Yes |
| SPOT (a) | Feb. 1986 | 6 (max) | – | 1–4 | No | No |
| AVHRR (b) | May 1998 | 6 | 1100 | 1 | Yes | Yes |
| MODIS (c) | Dec. 1999 | 34 | 250 (max) | 1 | Yes | Yes |
| GPM/TRMM (d) | Nov. 1997 | 34 | 0.1° | hourly | – | Yes |
(a) SPOT: Systeme Pour l'Observation de la Terre (French mission).
(b) AVHRR: Advanced Very High Resolution Radiometer (by the US National Oceanic and Atmospheric Administration, NOAA).
(c) MODIS: Moderate Resolution Imaging Spectroradiometer (by NASA).
(d) GPM/TRMM: Global Precipitation Measurement / Tropical Rainfall Measuring Mission (by NASA and JAXA).
• There is a spatial–temporal resolution trade-off.
• Studies apply different data based on their properties — spatial and/or temporal modeling.
Spaceborne/EO data applied so far in this domain
9.
| Index | Formula |
| Normalized Difference Vegetation Index (NDVI) | (NIR − Red) / (NIR + Red) |
| Normalized Difference Water Index (NDWI) | (NIR − SWIR) / (NIR + SWIR) |
Blue: Blue band (≈ 490 nm)
Red: Red band (≈ 700 nm)
NIR: Near infrared band (≈ 850 nm)
SWIR: Shortwave infrared band (≈ 1500 nm—2200 nm)
Spectral indices that are of interest
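As a rough illustration of how these indices translate into computation, the following minimal numpy sketch evaluates NDVI and NDWI from co-registered band rasters; the random band arrays and the zero-denominator guard are illustrative assumptions, not part of the original slides or paper.

```python
import numpy as np

def normalized_difference(band_a: np.ndarray, band_b: np.ndarray) -> np.ndarray:
    """Generic normalized-difference index (a - b) / (a + b), returning 0 where the denominator is 0."""
    num = band_a - band_b
    den = band_a + band_b
    return np.divide(num, den, out=np.zeros_like(num, dtype=float), where=den != 0)

# Illustrative band rasters (e.g. reflectance grids read from an EO product).
rng = np.random.default_rng(0)
nir = rng.random((100, 100))    # near-infrared band (~850 nm)
red = rng.random((100, 100))    # red band (~700 nm)
swir = rng.random((100, 100))   # shortwave infrared band (~1500-2200 nm)

ndvi = normalized_difference(nir, red)    # NDVI = (NIR - Red) / (NIR + Red)
ndwi = normalized_difference(nir, swir)   # NDWI = (NIR - SWIR) / (NIR + SWIR)
print(ndvi.mean(), ndwi.mean())
```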
10.
| Study | Sensor | EO features | Limitation(s) |
| Espinosa et al., 2016 | SPOT 5 | Vegetation map, water bodies, spectral feature, urban distribution, NBR | Data: SPOT data are not free; hence the study is not reproducible. |
| Scavuzzo et al., 2018; German et al., 2018 | MODIS and TRMM/GPM | NDVI, NDWI, daytime LST, night-time LST, precipitation | Method: performance–explainability trade-off between statistical and machine learning models. |
NBR: Normalised Burn Ratio – temperature proxy.
NDVI: Normalised Difference Vegetation Index.
NDWI: Normalised Difference Water Index.
LST: Land surface temperature (obtained from thermal infrared bands).
Featured studies that apply EO data for Ae. aegypti vector/disease modeling
11. • Spatial models:
• Free high resolution EO data.
• Temporal models:
• Nowcasting — information arrives when it is already too late.
• Spatio-temporal modeling.
Needed contributions
12. To propose a framework for using EO data for dengue vector population one-week-ahead forecasting in an urban area using RNNs.
Objective
13. • An RNN uses recurrent connections to capture sequential information, which makes it fit for time series prediction.
• Given a time series X = (x_1, x_2, . . . , x_T) with x_t ∈ R^u, a simple RNN is defined by:
h_t = f(h_{t−1}, x_t), (1)
where h_t ∈ R^v is the hidden state at time t.
• Due to vanishing gradients in RNNs, the long short-term memory (LSTM) variant has been developed.
RECURRENT NEURAL NETWORKS (RNN)
14. • Uses gating mechanisms to solve the vanishing gradient problem.
• h_t is obtained from the input x_t as follows:
f_t = σ(W_f [h_{t−1}; x_t] + b_f),
i_t = σ(W_i [h_{t−1}; x_t] + b_i),
o_t = σ(W_o [h_{t−1}; x_t] + b_o),
s_t = f_t ⊙ s_{t−1} + i_t ⊙ tanh(W_s [h_{t−1}; x_t] + b_s),
h_t = o_t ⊙ tanh(s_t).
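To make the gate equations concrete, here is a toy single-step numpy implementation of the LSTM cell described above; the weight shapes, random initialisation, and the sizes u = 7, v = 16 are illustrative assumptions, not values from the paper.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, h_prev, s_prev, W, b):
    """One LSTM step following the gate equations on the slide.
    W['f'], W['i'], W['o'], W['s'] have shape (v, v + u); b['*'] have shape (v,)."""
    z = np.concatenate([h_prev, x_t])                         # [h_{t-1}; x_t]
    f_t = sigmoid(W['f'] @ z + b['f'])                        # forget gate
    i_t = sigmoid(W['i'] @ z + b['i'])                        # input gate
    o_t = sigmoid(W['o'] @ z + b['o'])                        # output gate
    s_t = f_t * s_prev + i_t * np.tanh(W['s'] @ z + b['s'])   # cell state
    h_t = o_t * np.tanh(s_t)                                  # hidden state
    return h_t, s_t

u, v = 7, 16  # input and hidden sizes (illustrative)
rng = np.random.default_rng(0)
W = {k: rng.normal(scale=0.1, size=(v, v + u)) for k in 'fios'}
b = {k: np.zeros(v) for k in 'fios'}
h, s = np.zeros(v), np.zeros(v)
x = rng.normal(size=u)
h, s = lstm_step(x, h, s, W, b)
```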
20. • Cluster the ground truth data with the k-means approach.
• Given the Ae. aegypti population across m locations: Y = {y^(1), y^(2), . . . , y^(m)}, y^(i) ∈ R^P;
• Partition Y into k clusters with centers {c^(1), . . . , c^(k)}, c^(i) ∈ R^P;
• Denote the n EO feature means over the k clusters as X = {x^(1), x^(2), . . . , x^(k×n)}, x^(i) ∈ R^P, across the time period P.
• The resulting model is defined as a NARX (nonlinear autoregressive with exogenous inputs) model:
ĉ_t = F([c_{t−T}, . . . , c_{t−1}]; [x_{t−T}, . . . , x_{t−1}]), (3)
Methodology — Model
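A minimal sketch of how the lagged inputs and one-week-ahead targets implied by Eq. (3) could be assembled for one cluster, assuming the cluster-centre counts and per-cluster EO feature means are already available as arrays; the array names, shapes, and random data are illustrative assumptions.

```python
import numpy as np

def make_lagged_samples(c: np.ndarray, x: np.ndarray, T: int):
    """Build (input, target) pairs for the NARX-style forecaster in Eq. (3).

    c : array of shape (P,)            -- cluster-centre mosquito counts over P weeks
    x : array of shape (P, n_features) -- per-cluster mean EO features over the same weeks
    T : lag length (number of past weeks fed to the model)

    Each sample stacks [c_{t-T..t-1}, x_{t-T..t-1}] into a (T, 1 + n_features) window,
    with target c_t (one week ahead).
    """
    inputs, targets = [], []
    for t in range(T, len(c)):
        window = np.column_stack([c[t - T:t], x[t - T:t]])
        inputs.append(window)
        targets.append(c[t])
    return np.stack(inputs), np.array(targets)

# Illustrative example: 40 weeks, 6 EO features, lag T = 3.
P, n_features, T = 40, 6, 3
rng = np.random.default_rng(1)
X_windows, y_next = make_lagged_samples(rng.random(P), rng.random((P, n_features)), T)
print(X_windows.shape, y_next.shape)   # (37, 3, 7) (37,)
```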
21. • An encoder-decoder LSTM architecture has been used owing to its success in time series forecasting.
• Encoder: an LSTM that maps the input into a learned representation h_t ∈ R^v.
• Decoder:
• an LSTM that maps h_t to the decoder output d_t;
• and a fully connected layer with ReLU activation which takes d_t as input and produces ĉ_t.
Methodology — LSTM architecture
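Below is a minimal PyTorch sketch in the spirit of this encoder-decoder LSTM with a ReLU-activated fully connected output layer; the class name, hidden size, single-step decoder wiring, and the untrained usage example are assumptions rather than the authors' exact configuration.

```python
import torch
import torch.nn as nn

class EncoderDecoderLSTM(nn.Module):
    """Encoder LSTM summarises the T-week input window; a decoder LSTM plus a
    ReLU-activated fully connected layer produce the one-week-ahead forecast."""

    def __init__(self, n_features: int, hidden_size: int = 16):
        super().__init__()
        self.encoder = nn.LSTM(n_features, hidden_size, batch_first=True)
        self.decoder = nn.LSTM(hidden_size, hidden_size, batch_first=True)
        self.head = nn.Sequential(nn.Linear(hidden_size, 1), nn.ReLU())

    def forward(self, windows: torch.Tensor) -> torch.Tensor:
        # windows: (batch, T, n_features)
        _, (h_enc, c_enc) = self.encoder(windows)       # learned representation h_T
        dec_in = h_enc[-1].unsqueeze(1)                 # feed h_T as a one-step decoder input
        dec_out, _ = self.decoder(dec_in, (h_enc, c_enc))
        return self.head(dec_out[:, -1])                # (batch, 1) non-negative forecast

# Illustrative usage with windows shaped like the previous sketch (T = 3, 7 inputs).
model = EncoderDecoderLSTM(n_features=7, hidden_size=16)
x = torch.randn(8, 3, 7)
print(model(x).shape)                                   # torch.Size([8, 1])
```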
22.
| Location | Batch | Date range | Total weeks | Traps | Differentiating condition |
| Vila Velha | 2017 | 10/04/2017 - 31/12/2017 | 36 | 193 | - |
| Vila Velha | 2018 | 02/01/2018 - 05/10/2018 | 40 | 325 | Vector control |
| Serra | 2017 | 27/04/2017 - 30/12/2017 | 38 | 567 | - |
| Serra | 2018 | 05/01/2018 - 05/10/2018 | 40 | 95 | Vector control |
• Two different locations, two batches (ground conditions) per location.
• Per cluster, 50% of time points are used for training and 50% for testing; 20% of the training time points are selected for validation.
Test areas and ground truth data.
23. • Find the optimal number of mosquito data clusters (k) and obtain the clusters using k-means.
• k is chosen by the elbow method to reduce the distortion (J):
J = Σ_{j=1}^{k} Σ_{i=1}^{m} ‖x_i^{(j)} − c^{(j)}‖², (4)
• Find the optimal lag T ∈ {3, 6, 9}.
• Find the optimal representation size v ∈ {16, 32, 64, 128}.
• Compare the model to RF and KNN equivalents.
• Metric: mean absolute error (MAE).
Experimental procedure
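A hedged scikit-learn sketch of two of these steps: the elbow search over k (using within-cluster sum of squares as the distortion J) and the MAE comparison against random forest (RF) and k-nearest neighbour (KNN) baselines on flattened lagged windows. The synthetic data, the 50/50 split, and all hyperparameters are illustrative assumptions, not the paper's settings.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestRegressor
from sklearn.neighbors import KNeighborsRegressor
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(2)
trap_series = rng.random((60, 40))      # illustrative: 60 trap locations x 40 weekly counts

# Elbow method: distortion J (within-cluster sum of squares) for each candidate k.
distortions = {k: KMeans(n_clusters=k, n_init=10, random_state=0)
                    .fit(trap_series).inertia_ for k in range(1, 11)}

# Baseline comparison on lagged windows, flattened for the non-sequential models.
X = rng.random((37, 3 * 7))             # e.g. windows of T=3 weeks x 7 inputs, flattened
y = rng.random(37)
split = len(X) // 2                     # illustrative 50% train / 50% test split
for name, model in [("RF", RandomForestRegressor(random_state=0)),
                    ("KNN", KNeighborsRegressor(n_neighbors=5))]:
    model.fit(X[:split], y[:split])
    print(name, "MAE:", mean_absolute_error(y[split:], model.predict(X[split:])))
```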
25.
| | Vila Velha | | | Serra | | |
| T ⇒ | 3 | 6 | 9 | 3 | 6 | 9 |
| 2017 Training | 0.3117 | 0.3392 | 0.4926 | 0.2254 | 0.1509 | 0.2451 |
| 2017 Validation | 0.4627 | 0.1810 | 0.3745 | 0.1985 | 0.3275 | 0.2729 |
| 2017 Test | 0.6120 | 0.6450 | 0.7565 | 0.4048 | 0.4703 | 0.5889 |
| 2018 Training | 0.1407 | 0.2067 | 0.2762 | 0.2738 | 0.7999 | 0.2642 |
| 2018 Validation | 0.1998 | 0.4516 | 0.3395 | 0.2407 | 0.8574 | 0.3151 |
| 2018 Test | 0.3600 | 0.4624 | 0.4602 | 0.4418 | 0.9028 | 0.5372 |
Gomes et al., 2012 obtained their best dengue vector model in a Brazilian city with a T = 4 lagged effect of temperature and precipitation; hence, the result here (T = 3) is in line with it.
RESULTS: how many weeks of lagged EO variable effects?
26.
| v | | Vila Velha 2017 | Vila Velha 2018 | Serra 2017 | Serra 2018 |
| 16 | Training | 0.3117 | 0.1407 | 0.2254 | 0.2738 |
| | Validation | 0.4627 | 0.1998 | 0.1985 | 0.2407 |
| | Test | 0.6120 | 0.3600 | 0.4048 | 0.4418 |
| 32 | Training | 0.2637 | 0.1501 | 0.1880 | 0.2652 |
| | Validation | 0.4774 | 0.1975 | 0.2456 | 0.1817 |
| | Test | 0.6274 | 0.4126 | 0.4231 | 0.3986 |
| 64 | Training | 0.2841 | 0.1781 | 0.1630 | 0.2459 |
| | Validation | 0.4765 | 0.2231 | 0.1472 | 0.3632 |
| | Test | 0.6203 | 0.3816 | 0.3984 | 0.4329 |
| 128 | Training | 0.2802 | 0.1808 | 0.2266 | 0.6732 |
| | Validation | 0.3880 | 0.3189 | 0.2244 | 0.8454 |
| | Test | 0.5767 | 0.4038 | 0.4440 | 0.5794 |
A lower representation dimension is obtained when encoding lower variability.
RESULTS: Evaluating representation feature size
27.
| Model | | Vila Velha 2017 | Vila Velha 2018 | Serra 2017 | Serra 2018 |
| LSTM | Training | 0.2802 | 0.1407 | 0.1630 | 0.2652 |
| | Validation | 0.3880 | 0.1998 | 0.1472 | 0.1817 |
| | Test | 0.5767 | 0.3600 | 0.3984 | 0.3986 |
| RF | Training | 0.1660 | 0.1300 | 0.1424 | 0.1743 |
| | Validation | 0.8822 | 0.3920 | 0.3940 | 0.4709 |
| | Test | 0.6348 | 0.4045 | 0.4502 | 0.5057 |
| KNN | Training | 0.4237 | 0.3167 | 0.3067 | 0.3643 |
| | Validation | 0.9755 | 0.5010 | 0.4165 | 0.6127 |
| | Test | 0.6676 | 0.3869 | 0.4906 | 0.5000 |
Results: LSTM vs. RF vs. KNN
30. Aside from forecasting risks (e.g. early warning), the following applications are reachable:
• Spatio-temporal gap filling, especially when a trap location with missing data had previously been classified into a cluster.
• Data collection and collation resource optimization (e.g. rotation across zones).
Possible applications
31. • The proposed EO data-based sub-municipal (spatio-temporal) one-step-ahead forecasting framework shows robust performance for the task at hand.
• The method can serve operational vector control programs in spatio-temporal gap filling and manpower optimisation.
• RNN's powerful prediction capability enables more robust modeling.
• LIMITATION: The model is a black box. Next iterations can consider explainability approaches to improve the utility of the proposed methodology.
Conclusions