International Journal of Engineering Research and Applications (IJERA) is an open access online peer-reviewed international journal that publishes research and review articles in the fields of Computer Science, Neural Networks, Electrical Engineering, Software Engineering, Information Technology, Mechanical Engineering, Chemical Engineering, Plastic Engineering, Food Technology, Textile Engineering, Nanotechnology & Science, Power Electronics, Electronics & Communication Engineering, Computational Mathematics, Image Processing, Civil Engineering, Structural Engineering, Environmental Engineering, VLSI Testing & Low Power VLSI Design, etc.
Remote Sensing Method for Flood Management System (IJMREM Journal)
Floods occur when heavy and continuous rainfall exceeds the absorptive capacity of soil and the flow capacity of rivers, streams, and coastal areas. The land areas most subject to flooding are those situated adjacent to rivers and streams, known as floodplains and therefore considered "flood-prone". These areas are hazardous to development activities if the vulnerability of those activities exceeds an acceptable level. The main objectives of this study are to identify floodplains and other susceptible areas, and to assess the extent of disaster impact in the study area, located at Kota Tinggi, Johor, Malaysia. This area experienced an unprecedented flood from December 2006 to January 2007. Questions such as how often and for how long the floodplain will be covered by water, and at what time of year flooding can be expected, need to be answered. Thus, an understanding of the dynamic nature of floodplains is required. Multi-temporal Radarsat-1 images, a Landsat ETM+ image, topographic maps and land use maps were used in this study to delineate the flood extent before, during and after the flood event. A DEM derived from the topographic map was used to estimate flood depth. The final outputs of this study are flood extent and flood depth maps, both of which show the impact of the flood on the environment, lives and property. These maps can also be applied to develop a comprehensive relief effort immediately after flooding.
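The depth derivation the abstract describes, subtracting DEM ground elevation from an estimated water surface elevation within the observed flood extent, can be sketched in a few lines. The grid values, flood mask and water level below are invented for illustration, not the study's data:

```python
import numpy as np

# Hypothetical 3x3 DEM (ground elevation, metres) and a toy
# SAR-derived flood mask (True = pixel observed as flooded).
dem = np.array([[2.0, 1.5, 1.0],
                [1.8, 1.2, 0.8],
                [1.6, 1.0, 0.5]])
flood_mask = dem < 1.5          # toy mask: low-lying cells flooded

water_level = 1.5               # assumed water surface elevation (m)

# Flood depth = water surface elevation - ground elevation,
# clipped at zero and restricted to the flooded area.
depth = np.where(flood_mask, np.clip(water_level - dem, 0, None), 0.0)
print(depth.max())              # deepest inundation in this toy grid
```

In practice the water surface elevation would vary spatially and be interpolated along the flood boundary, but the per-pixel subtraction is the core of the method.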
International Journal of Computational Engineering Research (IJCER) is an international online journal published monthly in English. The journal publishes original research work that contributes significantly to furthering scientific knowledge in engineering and technology.
Spatial-temporal Characterization of Hurricane Path using GNSS-derived Precip... (CSCJournals)
The Global Navigation Satellite System (GNSS) precise point positioning (PPP) technique is capable of monitoring Precipitable Water Vapor (PWV) with high accuracy at low cost. As PWV is related to the initiation and development of severe convective weather systems, this study analyzed the characteristics of PWV variations over time and space to monitor and predict the path and intensity of severe rainfall during a hurricane. The PWV measurements are obtained by processing ground-based GNSS data. The spatial and temporal variations of PWV and other meteorological variables are characterized for the time frames before, during, and after the severe precipitation. Correlations between meteorological variables were mitigated by applying principal component analysis (PCA) and multivariate regression analysis. The method allows the expected movement of the rainfall to be determined up to 24 hours in advance. The proposed method was validated by analyzing the distribution pattern of the predicted PWV residual, its magnitude, and the actual observed PWV in the region. As a case study, we adopted one of the most destructive and long-lived hurricanes along the Florida, Georgia, North Carolina and South Carolina coasts, Hurricane Matthew, which occurred in October 2016. From the experiment, we identified the areas closely fitting the prediction model by computing the residuals against the GNSS-derived PWV measurements at each station in the test site. The residual of the predicted model is used for determining the track of extreme hurricane precipitation and can potentially be applied to evaluate its intensity. This study demonstrated the effectiveness of the statistical model for forecasting the hurricane rainfall path, which could be applied in a hazard early warning system.
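The statistical machinery the abstract names, PCA to decorrelate the meteorological variables followed by multivariate regression on the components, can be sketched with NumPy alone. The data below are random stand-ins, not real PWV observations:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical predictors: columns might be pressure, temperature,
# humidity, etc. at 100 epochs; y is the PWV-related target.
X = rng.normal(size=(100, 4))
y = X @ np.array([1.0, -0.5, 0.3, 0.0]) + rng.normal(scale=0.1, size=100)

# PCA via SVD on the centred predictors (decorrelates the variables).
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt.T            # principal-component scores

# Multivariate regression on the (uncorrelated) PC scores.
A = np.column_stack([np.ones(len(scores)), scores])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
pred = A @ coef

resid = y - pred              # residuals, as used for the track analysis
print(float(np.sqrt(np.mean(resid**2))))
```

Because the PC scores span the same space as the original predictors, the fit recovers the linear signal and the residual RMS settles near the injected noise level; in the study the analogous residuals are what trace the precipitation track.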
Remote Sensing Method for Flood Management SystemIJMREMJournal
Flood occurred when heavy and continuous rainfall exceeding the absorptive capacity of soil and the flow
capacity of rivers, streams, and coastal areas. Land areas that are most subjected to floods are areas situated
adjacent to rivers and streams, that are known as floodplain and therefore considered as “flood-prone”. These
areas are hazardous to development activities if the vulnerability of those activities exceeds an acceptable level.
The main objectives of this study are; to identify floodplains and other susceptible areas, and to assess the
extent of disaster impact in the study area which is located at Kota Tinggi, Johor, Malaysia. This area
experienced an unprecedented flood during December of 2006 to January of 2007.Questions such as how often
and how long the floodplain will be covered by water, and at what time of year flooding can be expected need to
be answered. Thus, an understanding of the dynamic nature of floodplains is greatly required. Multi-temporal
Radarsat-1images, Landsat ETM+ image, topographical maps and land use maps were used in this study for
the purpose of delineating the flood extend before, during and after the flood event. DEM acquired from
topographic map is used to derive flood depth. The final outputs of this study are flood extent and flood depth
maps where both of these maps show the impact of the flood to environment, lives and properties. This map is
also important and can be applied to develop a comprehensive relief effort immediately after flooding.
International Journal of Computational Engineering Research(IJCER) is an intentional online Journal in English monthly publishing journal. This Journal publish original research work that contributes significantly to further the scientific knowledge in engineering and Technology
Spatial-temporal Characterization of Hurricane Path using GNSS-derived Precip...CSCJournals
Global Navigation Satellite System (GNSS) precise point positioning (PPP) technique is capable of monitoring Precipitable Water Vapor (PWV) in high accuracy with low cost. As PWV is related to the initiation and development of a severe weather convective system, this study analyzed the characteristics of PWV variations over time and space to monitor and predict the path and the intensity of a severe rainfall during a hurricane. The PWV measurements are obtained by processing ground based GNSS data. The spatial and temporal variation of PWV and other meteorological variables are characterized for the time frames of before, during, and after the severe precipitation. The correlation effect between meteorological variables were mitigated by adapting a principle component analysis (PCA) and multivariate regression analysis. The method allows determining the expected movement of the rainfall up to 24 hours in advance. The proposed method was validated by analyzing the distribution pattern of the predicted PWV residual, its magnitude, and the actual observed PWV in the region. As a case study, we adopted one of the destructive and long-lived hurricane along the Florida, Georgia, North Carolina and South Carolina coast, namely, Hurricane Matthew, occurred in October 2016. From the experiment, we identified the areas closely fitting the prediction model by computing the residuals between the GNSS derived PWV measurements at each station in the test site. The residual of the predicted model is used for determining the track of extreme hurricane precipitation and potentially applied to evaluate its intensity. This study proved the effectiveness of the statistical model for forecasting the hurricane rainfall path that is potentially applied to a hazard early warning system.
Application of remote sensing in forest ecosystem (aliya nasir)
Established remote sensing systems provide opportunities to develop and apply new measurements of ecosystem function across landscapes, regions and continents.
New efforts to predict the consequences of ecosystem function change, both natural and human-induced, on the regional and global distributions and abundances of species should be a high research priority.
Using Remote Sensing Techniques For Monitoring Ecological Changes In Lakes: C... (IJERA Editor)
The ability to use remote sensing in studying lake ecology lies in the capability of satellite sensors to measure the spectral reflectance of constituents in water bodies. This reflectance can be used to determine the concentration of the constituents of the water column through mathematical relationships. This work identified a simple linear equation relating suspended matter in Lake Naivasha to reflectance in a Landsat 7 ETM+ image. An R² of 0.94 (n = 6) was obtained for suspended matter. An archive of Landsat imagery was then used to produce maps of suspended matter concentrations in the lake, and the suspended matter concentrations at five different locations in the lake over a 30-year period were estimated. It was concluded that the ecological changes Lake Naivasha is experiencing are the result of high water abstraction and the effects of climate change.
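Fitting the simple linear reflectance-to-concentration relationship and computing R² takes a few lines of NumPy. The six reflectance/concentration pairs below are invented for illustration, not the study's field measurements:

```python
import numpy as np

# Hypothetical ETM+ band reflectance vs. measured suspended matter
# (mg/l) at six sample points -- invented values, not the study's data.
reflectance = np.array([0.05, 0.08, 0.10, 0.12, 0.15, 0.18])
tsm         = np.array([12.0, 21.0, 27.0, 33.0, 42.0, 50.0])

# Fit the simple linear model tsm = a * reflectance + b.
a, b = np.polyfit(reflectance, tsm, 1)
pred = a * reflectance + b

# Coefficient of determination R^2.
ss_res = np.sum((tsm - pred) ** 2)
ss_tot = np.sum((tsm - tsm.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
```

Once calibrated against in situ samples, the same `a` and `b` can be applied pixel-by-pixel to archived imagery to map concentrations over time, which is how the 30-year trend in the study would be reconstructed.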
The International Journal of Engineering & Science is aimed at providing a platform for researchers, engineers, scientists, or educators to publish their original research results, to exchange new ideas, to disseminate information in innovative designs, engineering experiences and technological skills. It is also the Journal's objective to promote engineering and technology education. All papers submitted to the Journal will be blind peer-reviewed. Only original articles will be published.
This study used a UAV to analyze the characteristics of debris, in order to present a methodology for estimating the quantitative amount of debris caught in small river facilities. A total of six small rivers that maintained the form of a natural river were selected for collecting UAV images, and the targets in each image were grouped using an object-based classification method. Based on the object-based classification of the UAV images, land cover classification of the factors causing debris generation in the six target sections was carried out using the screen digitizing method. In addition, an error matrix analysis was performed to verify the accuracy of the classification result and secure its reliability. The accuracy analysis showed that, across all six target sections, the overall accuracy was 93.95% and the Kappa coefficient was 0.93, an excellent result.
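The two accuracy figures quoted above come from the standard error (confusion) matrix calculations: overall accuracy is the fraction of correctly classified samples, and Cohen's Kappa discounts chance agreement. A minimal sketch with an invented 2-class matrix:

```python
import numpy as np

def accuracy_and_kappa(cm):
    """Overall accuracy and Cohen's Kappa from a confusion matrix
    (rows = reference classes, columns = mapped classes)."""
    cm = np.asarray(cm, dtype=float)
    total = cm.sum()
    po = np.trace(cm) / total                                # observed agreement
    pe = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / total**2  # chance agreement
    return po, (po - pe) / (1 - pe)

# Toy 2-class error matrix: 45 + 45 correct, 5 + 5 confused.
acc, kappa = accuracy_and_kappa([[45, 5], [5, 45]])
print(acc, kappa)   # overall accuracy ~0.9, kappa ~0.8
```

The study's matrix would be larger (one row and column per land cover class), but the formulas are identical.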
Modification and Climate Change Analysis of surrounding Environment using Rem... (iosrjce)
This review is presented in three parts. The first part explains terms such as climate, climate change, climate change adaptation, remote sensing (RS) and geographical information systems (GIS). The second part highlights some areas where RS and GIS are applicable in climate change analysis and adaptation. The issues considered are snow/glacier monitoring, land cover monitoring, carbon tracing/accounting, atmospheric dynamics, terrestrial temperature monitoring, biodiversity conservation, ocean and coast monitoring, erosion monitoring and control, agriculture, flood monitoring, health and disease, and drought and desertification. The third part concludes from the illustrated instances that climate change problems will be less well understood and managed without the application of RS and GIS. While humanity is still being plagued by the effects of climate change, RS and GIS play a crucial role in its management for continued human survival.
Keywords: Climate, Climate Change, Climate Change Adaptation, Geographical Information System, Remote Sensing.
Editorial – Jan/Feb/Mar 2013 – Impact of the loss/addition of satellite altimetry on operational products
Greetings all,
This issue is dedicated to the study of the impact of the loss or addition of satellite altimetry on operational products and systems.
The first news feature by Larnicol et al. presents the GODAE OceanView Observing System Evaluation Task Team, whose primary objective is to support observational agencies by demonstrating the impact of observations on operational forecast and reanalysis systems. Its secondary objective is to improve the performance of operational ocean forecast systems.
The second paper by Labroue et al. reminds us of the main 2012 events within the satellite altimetry constellation. For the past two decades, we have taken for granted the presence of several satellites flying together. The loss of Envisat in April 2012 and the decision to put Jason-1 on its end-of-life orbit are a crude reminder of the constellation's fragility. Hence, during 2012, the DUACS and MyOcean Sea Level TAC teams have contributed to securing the altimetry component in the frame of operational oceanography.
The third paper by Labroue et al. displays the potential offered by CryoSat-2 for the mesoscale signal. The added value brought by CryoSat-2 as a complement to the existing altimetry constellation is discussed, as well as how CryoSat-2 could contribute to securing the altimetry constellation and thus operational oceanography. The CryoSat-2 mission has been included in the Near Real Time Sea Level system since February 2012 and was added to the Delayed Time system in April 2012.
The fourth paper by Remy et al. addresses the impact of the change in the satellite constellation on the French Mercator Ocean analysis and forecasting systems. The impact on the real-time products of the loss of the Envisat and Jason-1 along-track Sea Level Anomaly data at the beginning of 2012 is studied. A dedicated set of Observing System Experiments (OSEs) is performed and preliminary results are shown. An OSE involves running a copy of an existing assimilation run from which some observations are excluded. The difference between this run and the original run assimilating all the observations allows a detailed assessment of the impact the observations have on the assimilation system.
Finally, the fifth paper by Lea et al. shows a number of Observing System Experiments (OSEs) to assess the impact of the observing network on FOAM, the UK Met Office's ocean assimilation and forecasting system, as part of GODAE OceanView. A parallel version of the FOAM operational system was run during April 2011 withholding Jason-2 altimeter observations. Withholding Jason-2 removed 43% of the altimeter data and resulted in a 4% increase in the RMS SSH observation-minus-background differences, around ±2 °C small-scale changes in 100 m temperature, and around ±0.2 psu changes in surface salinity.
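The headline OSE metric here, the percentage change in RMS observation-minus-background differences between the control and withholding runs, is a simple calculation. A toy version with invented innovation statistics (not FOAM output):

```python
import numpy as np

def rms(x):
    return float(np.sqrt(np.mean(np.square(x))))

# Hypothetical SSH observation-minus-background differences (m) for a
# control run (all altimeters assimilated) and an OSE run (one
# altimeter withheld) -- invented numbers for illustration.
rng = np.random.default_rng(1)
control = rng.normal(scale=0.050, size=10_000)
ose     = rng.normal(scale=0.052, size=10_000)

increase_pct = 100.0 * (rms(ose) - rms(control)) / rms(control)
print(round(increase_pct, 1))  # near 4 for these toy scales
```

A larger RMS innovation in the withholding run means the background forecast drifted further from the (independent) observations, which is how the 4% figure quantifies Jason-2's contribution.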
We will meet again in April 2013.
International Journal of Engineering Research and Applications (IJERA) aims to cover the latest outstanding developments across all fields of engineering technology and science.
International Journal of Engineering Research and Applications (IJERA) is run by a team of researchers, not a publication service or private publisher operating journals for monetary benefit; we are an association of scientists and academics focused solely on supporting authors who want to publish their work. The articles published in our journal can be accessed online, and all articles are archived for real-time access.
Our journal primarily aims to bring out the research talent and the work done by scientists, academics, engineers, practitioners, scholars, and postgraduate students of engineering and science. The journal covers scientific research in a broad sense rather than a niche area, enabling researchers from various verticals to publish their papers. It also aims to provide a platform for researchers to publish in a shorter time, enabling them to continue their work. All published articles are freely available to scientific researchers in government agencies, to educators, and to the general public. We are making serious efforts to promote our journal across the globe in various ways, and we are confident it will act as a scientific platform for all researchers to publish their work online.
Flood Detection Using Empirical Bayesian Networks (IOSRJECE)
Flood mapping from Synthetic Aperture Radar (SAR) data has attracted considerable attention in recent years. Flooding is not only one of the most widespread natural disasters, regularly causing large numbers of casualties, rising economic losses, extensive homelessness and disaster-induced disease, but is also the most frequent disaster type. A valuable information source for flood mapping is remote sensing SAR imagery. However, flood scenarios are typical examples of complex situations in which different factors have to be considered to provide an accurate and robust interpretation of the situation on the ground. For this reason, a data fusion approach combining remote sensing data with ancillary information can be particularly useful. In this work, an Empirical Bayesian network is proposed to integrate remotely sensed data, such as multitemporal SAR intensity images and interferometric-SAR coherence data, with geomorphic and other ground information, whereas in previous work the authors used standard Bayesian networks. The methodology is tested on a case study of a flood that occurred in Visakhapatnam (India) in October 2014, monitored using a time series of TerraSAR-X data. It is shown that the synergetic use of different information layers helps detect the flood-affected areas more precisely, reducing the false alarms and missed identifications that may affect algorithms based on data from a single source. The produced flood maps are compared to data obtained independently from the analysis of optical images; the comparison indicates that the proposed methodology is able to reliably follow the temporal evolution of the phenomenon, assigning high probability to the areas most likely to be flooded, in spite of their heterogeneous temporal SAR/InSAR signatures, and reaching accuracies of up to 89%.
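The fusion idea, combining SAR intensity, InSAR coherence and ancillary ground information into a per-pixel flood probability, can be illustrated with a naive-Bayes sketch (a simplification of the network in the paper). All conditional probabilities and evidence names below are invented for illustration:

```python
def flood_posterior(prior, evidence, likelihoods):
    """Posterior P(flood | evidence) for binary evidence layers under a
    naive-Bayes factorisation (sources independent given the class).
    likelihoods[name] = (P(e=1 | flood), P(e=1 | no flood))."""
    p_f, p_n = prior, 1.0 - prior
    for name, observed in evidence.items():
        lf, ln = likelihoods[name]
        p_f *= lf if observed else (1.0 - lf)
        p_n *= ln if observed else (1.0 - ln)
    return p_f / (p_f + p_n)

# Invented conditional probabilities for three information layers:
# SAR backscatter drop, InSAR coherence loss, low-lying terrain.
lik = {"sar_drop": (0.90, 0.10),
       "coh_loss": (0.80, 0.30),
       "low_elev": (0.85, 0.40)}

strong = flood_posterior(0.2, {"sar_drop": True,  "coh_loss": True,  "low_elev": True},  lik)
weak   = flood_posterior(0.2, {"sar_drop": False, "coh_loss": False, "low_elev": False}, lik)
```

When all three layers agree, the posterior rises well above the prior; when none fires, it collapses toward zero, which is exactly how multi-source fusion suppresses the false alarms a single SAR threshold would produce.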
Airborne gravity anomaly data over Delta State in the Niger Delta basin of Nigeria have been interpreted to obtain the structural trends/types and depth to basement in the state. The residual gravity anomaly obtained from a second-order polynomial operation on the observed field data was enhanced by a first-order filtering operation based on the regional geology. This was converted to gridded data and analyzed qualitatively to reveal N-S and E-W trending subsurface structures. Inverse and forward modeling using Oasis Montaj software were applied to selected portions using geological models of sphere and dyke to reveal syncline and anticline structures at depths between 2005 m and 7372 m, with density contrasts between 1.12 g cm-3 and 2.70 g cm-3. The Euler deconvolution operation, with a structural index of one, revealed depths between 124.2 m and 16,000 m. The results show that the maximum depth to basement in the area occurs in the northern part of the state, with a maximum depth of 16,000 m.
Big data and remote sensing: A new software of ingestion (IJECEIAES)
Currently, remote sensing is widely used in environmental monitoring applications, mostly air quality mapping and climate change supervision. However, satellite sensors produce massive volumes of data in near-real-time, stored in multiple formats and delivered with high velocity and variety. Consequently, the processing of satellite big data is challenging. Thus, this study aims to show that satellite data are big data and proposes a new big data architecture for satellite data processing. The developed software enables efficient remote sensing big data ingestion and preprocessing. The experimental results show that 86 percent of the unnecessary daily files are discarded, with a data cleansing of 20 percent of the erroneous and inaccurate plots. The final output is integrated into the Hadoop system, especially HDFS, HBase, and Hive, for further calculation and processing.
NISAR (NASA-ISRO Synthetic Aperture Radar): Oil, Gas, and Water Underground Reservoirs
NASA (National Aeronautics and Space Administration)
By Dr. Pankaj Dhussa
TRACKING ANALYSIS OF HURRICANE GONZALO USING AIRBORNE MICROWAVE RADIOMETER (jmicro)
There is considerable interest in the use of airborne microwave radiometry for remote sensing instead of satellites; the key contribution of the airborne approach is providing high-accuracy real-time data. Airborne hurricane tracking is an important method compared with the spaceborne one, developed by the NASA Marshall Space Flight Center to provide high-resolution measurements by flying specially equipped aircraft using synthetic thinned array radiometry technology and capturing all critical measurements, such as hurricane eye location, wind speed and pressure. This paper describes the analysis of best-track positions for Hurricane Gonzalo based on data collected by airborne microwave radiometry. Significant analysis comes from comparing the airborne data with surface observations from ship reports. The main task is to estimate the peak intensity and minimum central pressure of Gonzalo from 12 to 19 October 2014, based on a blend of SFMR flight-level winds and pressure retrievals from observed brightness temperatures. The Stepped-Frequency Microwave Radiometer (SFMR), developed by the Langley Research Center, is designed to measure wind speed at the ocean's surface and rainfall rates within the storm accurately and continuously. The work also addresses realistic details of storm locations and valuable information about pressure and wind speed, which is critical for predicting growth and movement and for future monitoring of hurricane disasters, and presents a conceptual view of the stepped-frequency microwave radiometer on the airborne side. The objective of this research is tracking-analysis techniques based on comparing satellite, ship and airborne reports to obtain higher accuracy. The system operates at four spaced frequencies in the range between 4 GHz and 7 GHz and provides wide measurements over ±45° incidence angles.
Gonzalo (2014) is an example; the best results of retrieved wind speed, locations and pressure are presented. Several national projects have been developed for earth observation, such as fire, hurricane and border surveillance. This work applies efficient high-resolution C-band, four-frequency techniques, addresses the valuable information that comes from the airborne system, and discusses how the growth and movement of hurricanes are predicted. Passive microwave remote sensing from space at C band has the advantage of penetrating the atmosphere. The airborne system is able to operate fully polarimetrically in the C, X, S, L and P bands, covering wavelengths from 3 to 85 cm. The measurement modes include single-channel operation per wavelength and polarization.
Flooding is one of the natural disasters known to be part of the earth's biophysical processes, and its occurrence can be devastating, due mostly to anthropogenic activities and climatological factors. The aim of the research is to identify and map the extent of flood impact due to intense rainfall and rising water in the study area using geospatial techniques; the specific objectives are to carry out a terrain analysis of the study area and to generate flood indicator maps. The study analyzed rainfall data, the drainage system and Shuttle Radar Topography Mission (SRTM 30 m) data for the area. ArcGIS 10.8 was used to model and generate the contributing-factor maps of the study area. The drainage system was generated through on-screen digitization of a 1:50,000 topographic map of Ondo South-West. The mean annual rainfall of Lagos State was generated in the ArcGIS environment from the rainfall data through the spatial analysis tool. The SRTM data were used in the terrain analysis of the study area. The results showed that the lowest mean annual rainfall of the area was 1,700 mm and the highest was 2,440 mm. A digital elevation model (DEM), slope and flow direction were generated from the SRTM data. Drainage density of the area was generated using the drainage system. The slope map of the entire area was classified into five slope classes: very high (14%-48.5%), high (7.6%-13.9%), moderately high (4.2%-7.6%), low (1.5%-4.2%) and very low (0%-1.2%).
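The five-class slope reclassification reported in this abstract can be reproduced with a simple lookup. This is a minimal numpy sketch, with class edges taken from the text; the small gaps between the reported ranges are an artifact of the abstract, so each class boundary here is closed at the upper edge of the class below it.

```python
import numpy as np

# Upper edges of the first four classes (slope in percent), per the abstract:
# very low, low, moderately high, high; anything above 14% is "very high".
BINS = [1.5, 4.2, 7.6, 14.0]
LABELS = ["very low", "low", "moderately high", "high", "very high"]

def classify_slope(slope_pct):
    """Map a slope raster (percent rise) to the five named classes."""
    slope = np.asarray(slope_pct, dtype=float)
    idx = np.digitize(slope, BINS)      # bin index 0..4 per cell
    return np.take(LABELS, idx)

print(classify_slope([0.5, 3.0, 6.0, 10.0, 30.0]))
```

In practice the same reclassification is done on the whole SRTM-derived slope raster at once, since `np.digitize` is vectorized.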
Seawater Intrusion Vulnerability Assessment of a Coastal Aquifer: North Coast... (IJERA Editor)
Groundwater pollution in the north coast of Mombasa is not only from surface sources but also from the
intrusion of seawater via the Indian Ocean and creeks. This study assessed the vulnerability of the coastal aquifer
to seawater intrusion using GALDIT index overlay method with the aid of GIS. Thematic maps of six major
factors affecting seawater intrusion were prepared, and given appropriate weightages and ratings. These maps
were overlaid, spatially analyzed to produce vulnerability maps and described based on low, moderate or high
vulnerabilities. The results revealed a significant increase in percentage land cover for low vulnerability areas
and a slight increase for high vulnerability regions between the pre-rains period and the peak of the rainy season. The
outcomes of this study provide useful insights on effective groundwater management for the study area.
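The GALDIT index-overlay step described above reduces to a weighted average of six factor ratings per raster cell. The following is a minimal sketch using the commonly cited GALDIT factor weights; the cell ratings, and the thresholds for the low/moderate/high labels, are hypothetical and not the weightages or ratings used in this study.

```python
# GALDIT factors: Groundwater occurrence, Aquifer conductivity, water Level,
# Distance from shore, Impact of existing intrusion, aquifer Thickness.
def galdit_index(ratings, weights):
    """Weighted average of factor ratings for one raster cell."""
    assert len(ratings) == len(weights)
    return sum(r * w for r, w in zip(ratings, weights)) / sum(weights)

weights = [1, 3, 4, 4, 1, 2]           # commonly cited GALDIT weights
ratings = [7.5, 5, 7.5, 10, 5, 7.5]    # hypothetical cell ratings

score = galdit_index(ratings, weights)
label = "high" if score >= 7.5 else "moderate" if score >= 5 else "low"
print(round(score, 2), label)
```

In a GIS workflow this computation is applied cell-by-cell to the six overlaid thematic rasters, and the resulting score raster is then classified into the low/moderate/high vulnerability maps described in the abstract.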
Epistemic Interaction - tuning interfaces to provide information for AI support (Alan Dix)
Paper presented at SYNERGY workshop at AVI 2024, Genoa, Italy. 3rd June 2024
https://alandix.com/academic/papers/synergy2024-epistemic/
As machine learning integrates deeper into human-computer interactions, the concept of epistemic interaction emerges, aiming to refine these interactions to enhance system adaptability. This approach encourages minor, intentional adjustments in user behaviour to enrich the data available for system learning. This paper introduces epistemic interaction within the context of human-system communication, illustrating how deliberate interaction design can improve system understanding and adaptation. Through concrete examples, we demonstrate the potential of epistemic interaction to significantly advance human-computer interaction by leveraging intuitive human communication strategies to inform system design and functionality, offering a novel pathway for enriching user-system engagements.
Maruthi Prithivirajan, Head of ASEAN & IN Solution Architecture, Neo4j
Get an inside look at the latest Neo4j innovations that enable relationship-driven intelligence at scale. Learn more about the newest cloud integrations and product enhancements that make Neo4j an essential choice for developers building apps with interconnected data and generative AI.
GraphSummit Singapore | The Future of Agility: Supercharging Digital Transfor...Neo4j
Leonard Jayamohan, Partner & Generative AI Lead, Deloitte
This keynote will reveal how Deloitte leverages Neo4j’s graph power for groundbreaking digital twin solutions, achieving a staggering 100x performance boost. Discover the essential role knowledge graphs play in successful generative AI implementations. Plus, get an exclusive look at an innovative Neo4j + Generative AI solution Deloitte is developing in-house.
In the rapidly evolving landscape of technologies, XML continues to play a vital role in structuring, storing, and transporting data across diverse systems. The recent advancements in artificial intelligence (AI) present new methodologies for enhancing XML development workflows, introducing efficiency, automation, and intelligent capabilities. This presentation will outline the scope and perspective of utilizing AI in XML development. The potential benefits and the possible pitfalls will be highlighted, providing a balanced view of the subject.
We will explore the capabilities of AI in understanding XML markup languages and autonomously creating structured XML content. Additionally, we will examine the capacity of AI to enrich plain text with appropriate XML markup. Practical examples and methodological guidelines will be provided to elucidate how AI can be effectively prompted to interpret and generate accurate XML markup.
Further emphasis will be placed on the role of AI in developing XSLT, or schemas such as XSD and Schematron. We will address the techniques and strategies adopted to create prompts for generating code, explaining code, or refactoring the code, and the results achieved.
The discussion will extend to how AI can be used to transform XML content. In particular, the focus will be on the use of AI XPath extension functions in XSLT, Schematron, Schematron Quick Fixes, or for XML content refactoring.
The presentation aims to deliver a comprehensive overview of AI usage in XML development, providing attendees with the necessary knowledge to make informed decisions. Whether you’re at the early stages of adopting AI or considering integrating it in advanced XML development, this presentation will cover all levels of expertise.
By highlighting the potential advantages and challenges of integrating AI with XML development tools and languages, the presentation seeks to inspire thoughtful conversation around the future of XML development. We’ll not only delve into the technical aspects of AI-powered XML development but also discuss practical implications and possible future directions.
Pushing the limits of ePRTC: 100ns holdover for 100 daysAdtran
At WSTS 2024, Alon Stern explored the topic of parametric holdover and explained how recent research findings can be implemented in real-world PNT networks to achieve 100 nanoseconds of accuracy for up to 100 days.
Sudheer Mechineni, Head of Application Frameworks, Standard Chartered Bank
Discover how Standard Chartered Bank harnessed the power of Neo4j to transform complex data access challenges into a dynamic, scalable graph database solution. This keynote will cover their journey from initial adoption to deploying a fully automated, enterprise-grade causal cluster, highlighting key strategies for modelling organisational changes and ensuring robust disaster recovery. Learn how these innovations have not only enhanced Standard Chartered Bank’s data infrastructure but also positioned them as pioneers in the banking sector’s adoption of graph technology.
zkStudyClub - Reef: Fast Succinct Non-Interactive Zero-Knowledge Regex ProofsAlex Pruden
This paper presents Reef, a system for generating publicly verifiable succinct non-interactive zero-knowledge proofs that a committed document matches or does not match a regular expression. We describe applications such as proving the strength of passwords, the provenance of email despite redactions, the validity of oblivious DNS queries, and the existence of mutations in DNA. Reef supports the Perl Compatible Regular Expression syntax, including wildcards, alternation, ranges, capture groups, Kleene star, negations, and lookarounds. Reef introduces a new type of automata, Skipping Alternating Finite Automata (SAFA), that skips irrelevant parts of a document when producing proofs without undermining soundness, and instantiates SAFA with a lookup argument. Our experimental evaluation confirms that Reef can generate proofs for documents with 32M characters; the proofs are small and cheap to verify (under a second).
Paper: https://eprint.iacr.org/2023/1886
Dr. Sean Tan, Head of Data Science, Changi Airport Group
Discover how Changi Airport Group (CAG) leverages graph technologies and generative AI to revolutionize their search capabilities. This session delves into the unique search needs of CAG’s diverse passengers and customers, showcasing how graph data structures enhance the accuracy and relevance of AI-generated search results, mitigating the risk of “hallucinations” and improving the overall customer journey.
Securing your Kubernetes cluster_ a step-by-step guide to success !KatiaHIMEUR1
Today, after several years of existence, an extremely active community and an ultra-dynamic ecosystem, Kubernetes has established itself as the de facto standard in container orchestration. Thanks to a wide range of managed services, it has never been so easy to set up a ready-to-use Kubernetes cluster.
However, this ease of use means that the subject of security in Kubernetes is often left for later, or even neglected. This exposes companies to significant risks.
In this talk, I'll show you step-by-step how to secure your Kubernetes cluster for greater peace of mind and reliability.
Alt. GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using ...James Anderson
Effective Application Security in Software Delivery lifecycle using Deployment Firewall and DBOM
The modern software delivery process (or the CI/CD process) includes many tools, distributed teams, open-source code, and cloud platforms. Constant focus on speed to release software to market, along with the traditional slow and manual security checks has caused gaps in continuous security as an important piece in the software supply chain. Today organizations feel more susceptible to external and internal cyber threats due to the vast attack surface in their applications supply chain and the lack of end-to-end governance and risk management.
The software team must secure its software delivery process to avoid vulnerability and security breaches. This needs to be achieved with existing tool chains and without extensive rework of the delivery processes. This talk will present strategies and techniques for providing visibility into the true risk of the existing vulnerabilities, preventing the introduction of security issues in the software, resolving vulnerabilities in production environments quickly, and capturing the deployment bill of materials (DBOM).
Speakers:
Bob Boule
Robert Boule is a technology enthusiast with PASSION for technology and making things work along with a knack for helping others understand how things work. He comes with around 20 years of solution engineering experience in application security, software continuous delivery, and SaaS platforms. He is known for his dynamic presentations in CI/CD and application security integrated in software delivery lifecycle.
Gopinath Rebala
Gopinath Rebala is the CTO of OpsMx, where he has overall responsibility for the machine learning and data processing architectures for Secure Software Delivery. Gopi also has a strong connection with our customers, leading design and architecture for strategic implementations. Gopi is a frequent speaker and well-known leader in continuous delivery and integrating security into software delivery.
A tale of scale & speed: How the US Navy is enabling software delivery from l...sonjaschweigert1
Rapid and secure feature delivery is a goal across every application team and every branch of the DoD. The Navy’s DevSecOps platform, Party Barge, has achieved:
- Reduction in onboarding time from 5 weeks to 1 day
- Improved developer experience and productivity through actionable findings and reduction of false positives
- Maintenance of superior security standards and inherent policy enforcement with Authorization to Operate (ATO)
Development teams can ship efficiently and ensure applications are cyber ready for Navy Authorizing Officials (AOs). In this webinar, Sigma Defense and Anchore will give attendees a look behind the scenes and demo secure pipeline automation and security artifacts that speed up application ATO and time to production.
We will cover:
- How to remove silos in DevSecOps
- How to build efficient development pipeline roles and component templates
- How to deliver security artifacts that matter for ATO’s (SBOMs, vulnerability reports, and policy evidence)
- How to streamline operations with automated policy checks on container images
In his public lecture, Christian Timmerer provides insights into the fascinating history of video streaming, starting from its humble beginnings before YouTube to the groundbreaking technologies that now dominate platforms like Netflix and ORF ON. Timmerer also presents provocative contributions of his own that have significantly influenced the industry. He concludes by looking at future challenges and invites the audience to join in a discussion.
Observability Concepts EVERY Developer Should Know -- DeveloperWeek Europe (Paige Cruz)
Monitoring and observability aren’t traditionally found in software curriculums and many of us cobble this knowledge together from whatever vendor or ecosystem we were first introduced to and whatever is a part of your current company’s observability stack.
While the dev and ops silo continues to crumble….many organizations still relegate monitoring & observability as the purview of ops, infra and SRE teams. This is a mistake - achieving a highly observable system requires collaboration up and down the stack.
I, a former op, would like to extend an invitation to all application developers to join the observability party, and will share these foundational concepts to build on.
Enhancing adoption of Open Source Libraries. A case study on Albumentations.AI (Vladimir Iglovikov, Ph.D.)
Presented by Vladimir Iglovikov:
- https://www.linkedin.com/in/iglovikov/
- https://x.com/viglovikov
- https://www.instagram.com/ternaus/
This presentation delves into the journey of Albumentations.ai, a highly successful open-source library for data augmentation.
Created out of a necessity for superior performance in Kaggle competitions, Albumentations has grown to become a widely used tool among data scientists and machine learning practitioners.
This case study covers various aspects, including:
People: The contributors and community that have supported Albumentations.
Metrics: The success indicators such as downloads, daily active users, GitHub stars, and financial contributions.
Challenges: The hurdles in monetizing open-source projects and measuring user engagement.
Development Practices: Best practices for creating, maintaining, and scaling open-source libraries, including code hygiene, CI/CD, and fast iteration.
Community Building: Strategies for making adoption easy, iterating quickly, and fostering a vibrant, engaged community.
Marketing: Both online and offline marketing tactics, focusing on real, impactful interactions and collaborations.
Mental Health: Maintaining balance and not feeling pressured by user demands.
Key insights include the importance of automation, making the adoption process seamless, and leveraging offline interactions for marketing. The presentation also emphasizes the need for continuous small improvements and building a friendly, inclusive community that contributes to the project's growth.
Vladimir Iglovikov brings his extensive experience as a Kaggle Grandmaster, ex-Staff ML Engineer at Lyft, sharing valuable lessons and practical advice for anyone looking to enhance the adoption of their open-source projects.
Explore more about Albumentations and join the community at:
GitHub: https://github.com/albumentations-team/albumentations
Website: https://albumentations.ai/
LinkedIn: https://www.linkedin.com/company/100504475
Twitter: https://x.com/albumentations
Igor Ogashawara et al., Int. Journal of Engineering Research and Application,
ISSN: 2248-9622, Vol. 3, Issue 5, Sep-Oct 2013, pp. 1956-1960
RESEARCH ARTICLE | OPEN ACCESS | www.ijera.com
The Use of Optical Remote Sensing For Mapping Flooded Areas
Igor Ogashawara*, Marcelo Pedroso Curtarelli* and Celso M. Ferreira**
*(Remote Sensing Division, National Institute for Space Research, Brazil)
** (Department of Civil, Environmental and Infrastructure Engineering, George Mason University, USA)
ABSTRACT
Flood maps are a crucial tool to support emergency management, disaster recovery and risk reduction planning.
Traditional flood mapping methods are time-consuming, labor intensive, and costly. Our goal in this paper is to
introduce a novel technique to aggregate knowledge and information to map coastal flooded areas. We proposed
a Difference of Normalized Difference Water Indices (DNDWI) derived from two LANDSAT-5/TM surface
reflectance products acquired before and after the passage of Hurricane Ike, for Upper Texas in September of
2008. The reference flooded area was delineated interpolating the maximum surge in each location using a spline
with barriers method with high tension and a 30 meter Digital Elevation Model (DEM). It was noticed that
NDWI values decreased after the hurricane landfall on average from 0.226 to 0.122 for flooded areas. However,
for the non-flooded areas it increased from 0.292 to 0.300. Results from the Monte Carlo simulation showed that
mapping flooded areas with DNDWI achieved an accuracy of 85.68% while the non-flooded areas achieved an accuracy of
92.13%. Thus, DNDWI is a promising tool for mapping flooded areas since it is a cheap and simple technique
which can be applied rapidly to several areas of the planet.
Keywords - Flood mapping, Hurricane, Monte Carlo Simulation, NDWI, Optical Remote Sensing
I. INTRODUCTION
Hurricanes are one of the most costly natural
disasters in the United States [1] and recent storms
such as Hurricane Sandy (2012), Ike (2008) and
Katrina (2005) have caused major infrastructure
damage and losses of lives along the East and Gulf
Coasts [2]. Tropical cyclones are also a major cause of
flooding and damage in several regions worldwide
including recent events in the Bay of Bengal, Typhoon
Phailin (2013), and historically deadly storms such as
Tropical cyclone Thelma (1991), in the North Pacific
Ocean, and Cyclone Zoe (2002), in the South Pacific
Ocean, among others. Coastal flooding is one of the
major hazards to accompany a tropical cyclone
landfall [3] and can be aggravated by the combination
of the storm tidal surge and rainfall-runoff from the
heavy precipitation.
Traditionally, coastal flooding due to tropical
cyclones has been estimated by measured water levels
on buoys and coastal gages (e.g., [4-5]). Although
these monitoring networks provide good historical
data for coastal flooding, they lack spatial information
due to the limited number of stations over large areas.
Recent developments of physics based numerical
models using High Performance Computing (HPC)
(e.g., [6]) in addition to an increasingly large volume
of high resolution data (e.g., topo/bathymetry, land
use, wind fields) have led to an unprecedented
improvement in accuracy of tropical cyclones flood
prediction and mapping. A combination of numerical
modeling and measured data for model validation and
calibration is currently the state of the art in predicting
coastal flooding [7]. Recently the United States
Geological Survey (USGS) launched the Inland
Storm-Tide Monitoring Program [8] that provides
unprecedented detail in coastal flooding monitoring
during hurricane events on the United States coastal
areas. Despite recent advances in technology and
methods, the forecasting and mapping of coastal
flooding spatial extent is still a challenge especially in
large areas without extensive instrumentation.
In this way, remote sensing data can provide
useful information for mapping coastal flooding over
large areas. The main advantages in the use of remote
sensing data are the synoptic view of large areas,
spatial variability of data and repetitive acquisition.
Moreover, the cost of data acquired by sensors
onboard orbital platforms can be lower than that of data
acquired by conventional methods (discounting the
cost of the satellites) [9]. Recently, remote sensing
data have been applied to identify flood areas using
the Environmental Satellite (ENVISAT) advanced
synthetic aperture radar (ASAR) and Landsat
Thematic Mapper (TM) optical imagery to document
the flooded extent of Hurricane Ike (2008) by [10].
Subsequently, [11] presented a summary of the
limitations and potentials of satellite imagery to
monitor and map coastal flooding for Hurricanes
Gustav and Ike (2008) demonstrating that the
correspondence between ground data and ASAR-based flood mapping ranged from 86 to 96% for water
levels higher than 0.8 m.
Despite the good correspondence between
ground data and RADAR-based flood mapping, such
as demonstrated by [11], RADAR (Radio Detection
and Ranging) sensors (e.g., ASAR, TerraSAR,
RADARSAT) do not always acquire imagery freely, and
generally their products have high costs. Moreover,
the data acquisition is sparse and over small areas.
Therefore, it is necessary to develop new techniques
based only on optical remote sensing data, which
are generally freely available, offer more options of
spatial and temporal resolution, and have historical
time series with more than 10 years of records (e.g.,
the MODIS and TM sensors), which allows mapping
the flooding extent of past events.
The objective of this study is to introduce a
novel technique based on optical remote sensing
techniques in order to aggregate knowledge and
information to map coastal flood areas. We present a
case study for the coastal flooding caused by
Hurricane Ike on the Texas coast and develop our
method using the Normalized Difference Water Index
(NDWI), which was derived from two medium-resolution
LANDSAT-5/TM surface reflectance
products from the LANDSAT Climate Data Record
(CDR).
II. STUDY AREA AND EVENT
The study area is the upper Texas coast,
USA, which was directly impacted by Hurricane Ike
in 2008 more specifically between the coordinates
30°46'23"N 95°41'48"W and 29°19'53"N 94°1'57"W
(Fig. 1). According to [12], the frequency of
hurricanes in this region is about one every six years
and the annual probability of hurricane occurrence is
around 31%. However, more than 40 events were
registered in the past century (one every three years),
which indicates an increase in the frequency of events.
Hurricane Ike was a Category 2 hurricane
when it made landfall on the Texas coast [2] causing
severe damage to the State of Texas, Bahamas and
Cuba as well. The storm winds reached 230 km h-1
with the lowest central pressure at 935 mbar leaving
an estimated death toll of 103 people and around 40
billion dollars in damages.
III. DATASET
The dataset used in this study includes in situ
measurements and remote sensing images. The in situ
data were collected by the USGS mobile storm surge
network [8], which provides atmospheric pressure and
water level data every 6 minutes.
measured using a pressure transducer (HOBO
Onset®) and stored in the USGS database [8].
Topography was obtained from a Digital Elevation
Model (DEM) dataset extracted from the National
Elevation Dataset (NED) [13] representing the entire
region topography at a resolution of 1 arc-second.
The remote sensing data comprise images
collected by the TM sensor onboard the LANDSAT-5
satellite. These images are provided with 7 spectral
bands (from the visible to the thermal spectral regions),
quantized in 8 bits and with 30-meter spatial
resolution, except for the thermal band, which has 120 m
spatial resolution. The images are acquired every 16
days. In this study we used the surface reflectance
product from the LANDSAT CDR. This product is
generated from specialized software called Landsat
Ecosystem Disturbance Adaptive Processing System
(LEDAPS) which provide images converted for
reflectance values and corrected for the atmospheric
effects. More information about this product is
available at [14]
IV. METHODS
4.1. REFERENCE FLOODED AREA
The reference flooded area was determined
based on the methodology proposed by [15] and
consists of spatially interpolating the maximum flood
height at each monitoring station and calculating the
water depths based on the DEM. For this study we
considered 59 recording stations along the study area.
The maximum water heights were extracted from the
recorded time series to represent the maximum flood
level at each location. A spatial interpolation using a spline with higher tension (weight of 20, as suggested by [15]) was used to develop the maximum water level coverage for the region. The water depths were calculated by subtracting the DEM from the maximum flood surface, using the vertical NAVD88 datum for the region. The resulting coverage was re-classified to remove the dry areas, defining the estimated flood extent.
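The interpolate-then-subtract procedure above can be sketched as follows. This is a minimal illustration, not the authors' GIS workflow: inverse-distance weighting is used as a simple stand-in for the tension spline of [15], and all function names, grid coordinates and units are hypothetical.

```python
import numpy as np

def interpolate_max_water_level(stations_xy, max_levels, grid_x, grid_y, power=2.0):
    """Interpolate station maxima onto a grid. Inverse-distance weighting
    is used here as a simple stand-in for the tension-spline interpolation
    (weight 20) described in the paper."""
    gx, gy = np.meshgrid(grid_x, grid_y)
    surface = np.zeros_like(gx, dtype=float)
    weight_sum = np.zeros_like(gx, dtype=float)
    for (sx, sy), level in zip(stations_xy, max_levels):
        d = np.hypot(gx - sx, gy - sy)
        d = np.maximum(d, 1e-6)          # avoid division by zero at a station
        w = 1.0 / d**power
        surface += w * level
        weight_sum += w
    return surface / weight_sum

def flood_depth(water_surface, dem):
    """Depth = interpolated maximum water surface minus DEM;
    non-positive depths (dry areas) are masked out."""
    depth = water_surface - dem
    return np.where(depth > 0, depth, np.nan)
```

The re-classification step corresponds to the `np.where` mask: any cell whose water surface does not exceed the terrain is treated as dry.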
Figure 1. Study area on the upper Texas coast, near Hilton, and the reference flooded area. LANDSAT-5/TM image acquired on 4 September (true-color composition).

4.2. NORMALIZED DIFFERENCE WATER INDEX
As vegetation is one of the targets most affected by floods in coastal areas, changes in its water content can be used to identify the affected areas. The NDWI could thus be an alternative for optical remote sensing to map flooded areas, since this index estimates the water content within vegetation (1) [16]:

NDWI = (ρ0.86µm − ρ1.24µm) / (ρ0.86µm + ρ1.24µm)    (1)

where ρ0.86µm is the reflectance at 0.86 µm and ρ1.24µm is the reflectance at 1.24 µm.

Igor Ogashawara et al., Int. Journal of Engineering Research and Applications (IJERA), ISSN: 2248-9622, Vol. 3, Issue 5, Sep-Oct 2013, pp. 1956-1960
According to [16], it measures the liquid
water molecules in vegetation that interact with solar
radiation. The NDWI has been widely used because it
is less sensitive to atmospheric scattering effects
compared with the NDVI. However, similar to the
NDVI, the NDWI does not completely remove the soil background reflectance effects. Using LANDSAT-5/TM bands, the NDWI was estimated as follows (2):

NDWI = (ρTM B4 − ρTM B5) / (ρTM B4 + ρTM B5)    (2)

where ρTM B4 is the reflectance at LANDSAT-5/TM band 4 and ρTM B5 is the reflectance at LANDSAT-5/TM band 5. We used band 5 (1.65 µm) as an approximation of the 1.24 µm band [16]. This procedure was previously described in several studies [17-19].
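Equation (2) and the two-date difference used later in the method can be sketched as a minimal NumPy illustration. The DNDWI sign convention (pre-event NDWI minus post-event NDWI, so a drop in vegetation water gives a positive DNDWI) is inferred from the results section; all function names are hypothetical.

```python
import numpy as np

def ndwi(nir, swir):
    """NDWI after Gao [16], Eq. (2): (B4 - B5) / (B4 + B5), with
    LANDSAT-5/TM band 4 (NIR) and band 5 (SWIR) reflectances.
    Cells with zero total reflectance are returned as NaN."""
    nir = np.asarray(nir, dtype=float)
    swir = np.asarray(swir, dtype=float)
    denom = nir + swir
    with np.errstate(divide="ignore", invalid="ignore"):
        index = (nir - swir) / denom
    return np.where(denom != 0, index, np.nan)

def dndwi(ndwi_before, ndwi_after):
    """Difference of NDWI between pre- and post-hurricane scenes.
    A decrease of NDWI after the event yields a positive DNDWI."""
    return ndwi_before - ndwi_after
```

In practice the two input arrays would be co-registered surface reflectance bands from the two LANDSAT CDR acquisitions.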
4.3. METHOD DEVELOPMENT
The use of the NDWI to classify areas flooded by a hurricane event consists of analyzing the NDWI values from two different dates (before and after the hurricane). The difference between the NDWI values from the two dates is here called the Difference of NDWI (DNDWI). Two thousand DNDWI values were used to classify flooded and non-flooded areas according to the limits established for each of these two classes. These limits were computed from the univariate statistics of the DNDWI values of each reference class: the average and standard deviation of each class were used to determine the lower and upper limits for categorizing the DNDWI values. The DNDWI threshold for each class was chosen as plus/minus one standard deviation from the mean value plus a constant (see Table 1 in the results and discussion section).

4.4. VALIDATION
A Monte Carlo Method (MCM) was used to validate the DNDWI classification. The MCM, also known as statistical simulation, is defined as any method that uses sequences of random numbers to perform a simulation; the outcome of many such simulations is used to obtain a probable solution. This technique provides an approximate, quick answer with a high level of accuracy, and the accuracy can be increased with additional simulations [20]. We collected 2000 stratified points from the DNDWI image, with each class (flooded and non-flooded) receiving 1000 random values. From these values, averages of 20 samples were calculated 10,000 times, and these 10,000 average values were used to evaluate the proposed classification against the thresholds: the analysis consisted of checking how many of the 10,000 values fell within the threshold range of the correct class.

V. RESULTS AND DISCUSSIONS
5.1. DNDWI CLASSIFICATION
For the DNDWI classification we collected 1000 points for each class based on the reference and calculated their univariate statistics and a histogram (Fig. 2). For the 1000 points from the flooded area, the average value was 0.107, while for the non-flooded area the average was -0.011. The median for the flooded area was 0.140, and the non-flooded area had a median value of 0.003. This shows that the DNDWI is higher for flooded areas and lower for non-flooded ones. Precipitation occurred in both areas; however, as the NDWI measures the water content in the vegetation, the index saturated in the flooded areas while it worked well in the non-flooded areas.

Figure 2. Histograms of DNDWI from the two classes according to the reference classification.
The values of DNDWI can be explained by the mean values of NDWI before and after Hurricane Ike. The mean value of the NDWI before the hurricane was 0.292 for the non-flooded areas and 0.2264 for the flooded areas. After the hurricane, these values changed to 0.300 in the non-flooded areas and 0.122 in the flooded areas, an increase of 2.74% in the NDWI values for the non-flooded area and a decrease of 45.82% for the flooded areas. Thus, as the index value increases from the first to the second image in the non-flooded areas, the DNDWI tends to decrease; the opposite is true in the flooded areas, where DNDWI values tend to increase due to the decrease of NDWI in the second image. Based on these characteristics, the thresholds were calculated from the average plus/minus one standard deviation from the mean value plus a constant (Table 1).
Table 1. Thresholds for the DNDWI classification.

Class           Threshold
Non-flooded     < 0.05
Flooded         > 0.05
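The threshold construction of Section 4.3 and the single cutoff of Table 1 can be sketched as follows. This is a minimal illustration: the paper does not state the value of the added constant (it defaults to zero here), and the function names are hypothetical.

```python
import numpy as np

def class_limits(samples, constant=0.0):
    """Lower/upper limits for one class: mean +/- one standard deviation
    plus a constant, as described in Section 4.3. The constant's value is
    not given in the paper and defaults to 0 here."""
    m, s = np.mean(samples), np.std(samples)
    return m - s + constant, m + s + constant

def classify_dndwi(dndwi_values, threshold=0.05):
    """Apply the single cutoff of Table 1: DNDWI > threshold -> flooded."""
    return np.where(np.asarray(dndwi_values) > threshold,
                    "flooded", "non-flooded")
```

With the class means reported above (0.107 flooded, -0.011 non-flooded), the 0.05 cutoff sits between the two class distributions.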
5.2. VALIDATION USING MCM
To validate the choice of thresholds, the 10,000 simulated DNDWI averages were analyzed. For flooded areas the values ranged from -0.099 to 0.285, while for non-flooded areas they ranged from -0.193 to 0.143. The mean values of these two series were 0.106 and -0.010 for the flooded and non-flooded areas, respectively. A histogram of all 10,000 DNDWI values and the threshold limit are shown in Fig. 3.

Figure 3. Histogram of DNDWI simulated averages by the MCM.
The MCM results also showed that, for flooded areas, 8568 values were classified as flooded by the threshold analysis, while for non-flooded areas 9213 values fell within the proposed threshold range. Thus the DNDWI achieved an accuracy of 85.68% for mapping flooded areas, showing that it is a potential tool to help managers and policy makers analyze the impacted areas.
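The Monte Carlo check described in Sections 4.4 and 5.2 can be sketched as follows. This is a minimal illustration with hypothetical names; resampling with replacement is an assumption, since the paper does not state the exact sampling scheme.

```python
import numpy as np

def mcm_accuracy(class_samples, threshold=0.05, flooded=True,
                 n_draws=10_000, group_size=20, seed=42):
    """Monte Carlo check of the threshold: draw `group_size` random samples
    from one class, average them, repeat `n_draws` times, and return the
    fraction of averages landing on the correct side of the threshold."""
    rng = np.random.default_rng(seed)
    samples = np.asarray(class_samples, dtype=float)
    draws = rng.choice(samples, size=(n_draws, group_size), replace=True)
    means = draws.mean(axis=1)
    correct = means > threshold if flooded else means <= threshold
    return correct.mean()
```

Run once with the 1000 flooded-class DNDWI points (`flooded=True`) and once with the non-flooded points (`flooded=False`); the paper's 85.68% and 92.13% figures are the two resulting fractions.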
We also calculated the Wilcoxon signed rank test for the 10,000 DNDWI values obtained through the MCM. The null hypothesis was rejected at the 5% significance level (p-value < 0.001); thus, there was a significant difference between the DNDWI values from flooded and non-flooded areas.
VI. CONCLUSION
A methodology for mapping spatial variations of flood inundation caused by hurricane events using optical remotely sensed image series was developed in this study. Since flood prevention, management and emergency response are important issues for policy makers, flood detection using remote sensing can increase the number of monitored areas in remote places or in places without any monitoring program. As optical orbital sensors, such as the Landsat TM, ETM+ and OLI family, continuously collect data over the entire planet, the use of an optical sensor could enhance the knowledge of flood spatial mapping in areas lacking data. Our results showed that the DNDWI combined with a threshold analysis for distinguishing flooded areas could be a potential tool to enhance the mapping of tropical cyclone flooding. Using the MCM technique, 85.68% of the 10,000 mean values, each an average of 20 out of 1000 DNDWI values, were accurately classified as flooded areas; for the non-flooded area an accuracy of 92.13% was found.
However, these results were based on only one event (Hurricane Ike) and should be extended to other study areas. We also observed that additional spectral behavior studies are needed to explain the relationship between water content and vegetation spectral response in flooded areas. Nevertheless, we proposed a methodology which could be a useful tool for countries without any flood monitoring program or RADAR imagery coverage, since the mapping of flooded areas is an important issue for the economy and the rebuilding of the affected region. Although Synthetic Aperture Radar has been previously applied, the cost of the images is high and their processing is labor-intensive. Optical remote sensing is not only cheaper, but its data are also much easier to manage. With the advance of orbital hyperspectral sensors such as the Hyperspectral Imager for the Coastal Ocean (HICO), it will become possible to identify the key spectral ranges for detecting elevated water content in vegetation. Thus, hyperspectral studies will enhance the use of optical remote sensing to map flooded areas.
VII. ACKNOWLEDGEMENTS
The first author is grateful to the Brazilian Federal Agency for Support and Evaluation of Graduate Education (CAPES) for the master's scholarship. The second author wishes to thank the National Counsel of Technological and Scientific Development (CNPq) (grant 161233/2013-9) for the PhD scholarship.
REFERENCES
[1] N. Lott and T. Ross, Tracking and Evaluating U.S. Billion Dollar Weather Disasters, 1980-2005, National Climatic Data Center (NCDC), 2006. Available at: http://www1.ncdc.noaa.gov/pub/data/papers/200686ams1.2nlfree.pdf. Accessed on: 12 Oct. 2013.
[2] National Oceanic and Atmospheric Administration (NOAA), Tropical Cyclone Report: Hurricane Ike, 2013a. Available at: http://www.nhc.noaa.gov/2008atlan.shtml. Accessed on: 10 Oct. 2013.
[3] D. Resio and J. Westerink, Modeling the physics of storm surges, Physics Today, 61(9), 2008, 33-38.
[4] National Oceanic and Atmospheric Administration (NOAA), Tides and currents, 2013b. Available at: http://tidesandcurrents.noaa.gov/. Accessed on: 15 Oct. 2013.
[5] United Nations Education, Scientific and Cultural Organization (UNESCO), The Global Ocean Observing System (GOOS), 2013. Available at: http://www.iocgoos.org/. Accessed on: 11 Oct. 2013.
[6] J. Dietrich, M. Zijlema, J. Westerink, L. Holthuijsen, C. Dawson, R. Luettich, R. Jensen, J. Smith, G. Stelling and G. Stone, Modeling hurricane waves and storm surge using integrally-coupled, scalable computations, Coastal Engineering, 58(1), 2011, 45-65.
[7] C. Dawson, E.J. Kubatko, J.J. Westerink, C. Trahan, C. Mirabito, C. Michoski and N. Panda, Discontinuous Galerkin methods for modeling hurricane storm surge, Advances in Water Resources, 34(9), 2011, 1165-1176.
[8] United States Geological Survey (USGS), The storm-tide monitoring program, 2013a. Available at: http://water.usgs.gov/osw/programs/storm_surge1.html. Accessed on: 15 Oct. 2013.
[9] J.R. Jensen, Remote sensing of the environment: An earth resource perspective (Boca Raton, FL: Prentice Hall, 2006).
[10] E. Ramsey, D. Werle, Z. Lu, A. Rangoonwala and Y. Suzuoki, A case of timely satellite acquisition in support of coastal emergency and environmental response management, Journal of Coastal Research, 25(5), 2009, 1168-1172.
[11] E. Ramsey, D. Werle, Y. Suzuoki, A. Rangoonwala and Z. Lu, Limitations and potential of satellite imagery to monitor environmental response to coastal flooding, Journal of Coastal Research, 28(2), 2012, 457-476.
[12] D. Roth, Texas Hurricane History, National Weather Service, 2010. Available at: http://www.wpc.ncep.noaa.gov/research/txhur.pdf. Accessed on: 11 Oct. 2013.
[13] United States Geological Survey (USGS), National Elevation Dataset, 2013b. Available at: http://ned.usgs.gov/. Accessed on: 04 Aug. 2013.
[14] J.G. Masek, E.F. Vermote, N. Saleous, R. Wolfe, F.G. Hall, F. Huemmrich, F. Gao, J. Kutler and T.K. Lim, A Landsat surface reflectance data set for North America, 1990-2000, Geoscience and Remote Sensing Letters, 3(1), 2006, 68-72.
[15] C. Berenbrok, R.R. Mason and S.F. Blanchard, Mapping Hurricane Rita inland storm tide, Journal of Flood Risk Management, 2(1), 2009, 76-82.
[16] B.C. Gao, NDWI - A Normalized Difference Water Index for remote sensing of vegetation liquid water from space, Remote Sensing of Environment, 58(3), 1996, 257-266.
[17] X.L. Chen, H.M. Zhao, P.X. Li and Z.Y. Yin, Remote sensing image-based analysis of the relationship between urban heat island and land use/cover changes, Remote Sensing of Environment, 104(2), 2006, 133-146.
[18] D.D. Bosch and M.H. Cosh, SMEX03 Landsat Thematic Mapper NDVI and NDWI (Boulder, CO: National Snow and Ice Data Center, 2008).
[19] T.J. Jackson and M.H. Cosh, SMEX03 Landsat Thematic Mapper NDVI and NDWI (Boulder, CO: National Snow and Ice Data Center, 2007).
[20] Y. Hong, K.-L. Hsu, H. Moradkhani and S. Sorooshian, Uncertainty quantification of satellite precipitation estimation and Monte Carlo assessment of the error propagation into hydrologic response, Water Resources Research, 42(8), 2006, 1-15.