DA 2 NDMM.pptx
1. NATURAL DISASTER MITIGATION AND
MANAGEMENT
DIGITAL ASSIGNMENT 2
TITLE - COASTAL DISASTER MANAGEMENT
GROUP MEMBERS-
DIVYANSH GOLANI- 19BCL0077
JATIN DHINGRA- 19BIT0187
GURU PRASAD TRIPATHI- 20BME0291
2. AIM- Strategy for tracking oil-spill trajectories to stop marine pollution
Methods- A two-step oil-spill forecasting system for predicting the trajectories of oil slicks. First, an X-band
radar is installed to detect oil slicks close to the oil-spill site and obtain the location and extent of the spill; a
procedure for detecting the oil-slick area from radar images is developed for this purpose. The location and
extent of the spill serve as the initial conditions for simulating the subsequent oil-spill trajectories. Furthermore,
because the radar can also measure ocean surface currents, this information, combined with available wind data,
can provide a fast assessment of the oil-slick trajectories. The results are useful for executing rescue missions
and for the rapid deployment of emergency equipment. The simulated water surface elevations and ocean
surface currents obtained from SCHISM were compared with monitoring data to verify the accuracy of the
hydrodynamic model; both agree closely with the observations. After obtaining the surface elevation and current
from SCHISM, the Lagrangian particle-tracking model incorporated in SCHISM was used to simulate the
trajectories of oil spills under various wind conditions, such as the observed wind speeds and directions from a
nearby data buoy or the wind fields predicted by the WRF model.
Chiu, C. M., Huang, C. J., Wu, L. C., Zhang, Y. J., Chuang, L. Z. H., Fan, Y., & Yu, H.
C. (2018). Forecasting of oil-spill trajectories by using SCHISM and X-band radar.
Marine Pollution Bulletin, 137, 566-581.
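The Lagrangian particle-tracking step described above can be illustrated with a minimal sketch (not the SCHISM implementation): each oil parcel is advected by the surface current plus a wind-drift term, here taken as 3% of the wind speed, a common rule of thumb and an assumption in this sketch.

```python
import numpy as np

def track_parcels(x, y, u_curr, v_curr, u_wind, v_wind,
                  dt=600.0, n_steps=144, wind_factor=0.03):
    """Advect oil parcels with surface current plus a wind-drift term.

    x, y           : arrays of parcel positions (m)
    u_curr, v_curr : surface current components (m/s)
    u_wind, v_wind : wind components (m/s)
    wind_factor    : fraction of wind speed transferred to the slick
                     (~3% is a common rule of thumb; an assumption here)
    """
    xs, ys = [x.copy()], [y.copy()]
    for _ in range(n_steps):
        # Forward-Euler step: drift velocity = current + scaled wind
        x = x + (u_curr + wind_factor * u_wind) * dt
        y = y + (v_curr + wind_factor * v_wind) * dt
        xs.append(x.copy())
        ys.append(y.copy())
    return np.array(xs), np.array(ys)

# Ten parcels released at the spill site; steady current and wind,
# tracked for 24 h (144 steps of 10 min)
x0 = np.zeros(10)
y0 = np.linspace(0.0, 100.0, 10)
xs, ys = track_parcels(x0, y0, u_curr=0.2, v_curr=0.0,
                       u_wind=5.0, v_wind=2.0)
```

In the actual study, the current and wind fields vary in space and time (SCHISM output, buoy observations, or WRF forecasts) rather than being constants as here.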
3. Tools- SCHISM (Semi-implicit Cross-scale Hydroscience Integrated System Model) is an open-source,
community-supported modeling system based on unstructured grids, designed for seamless simulation of 3D baroclinic
circulation across creek-lake-river-estuary-shelf-ocean scales. It uses a highly efficient and accurate semi-implicit finite-
element/finite-volume method with a Eulerian-Lagrangian algorithm to solve the Navier-Stokes equations (in hydrostatic
form), in order to address a wide range of physical and biological processes.
4. Conclusion- The oil spill caused by the container ship T. S. Taipei, which occurred in March 2016, was used as a case
study for testing the capability of the proposed oil-tracking strategy, especially for events that happen very close to the
coast. When the wind data from the nearby Fugui Cape buoy were used, the simulated fouled coastline obtained from
SCHISM was very similar to that obtained from the field observations provided by the EPA, Taiwan. However, the
contaminated coastline predicted using the wind data from the WRF model was much shorter than the observed one.
The reasons for this difference were revealed after comparing the simulated and observed wind speeds and wind
directions. Because the ship was stranded very close to the coast, the oil slicks reached the coast within one day of the
event. In the future, the proposed strategy should be applied to oil-spill events occurring on the open sea to examine its
suitability for wider applications.
5. Aim- Quantifying salt marsh shoreline erosion
Methods-
1. Landscape background- The predominantly organic surface soils originating from these marsh plants overlie layers
of sand, silt and clay far below the root zone, deposited hundreds, if not thousands, of years ago. The
roots and rhizomes within the modern organic layer help to hold the overlying marsh soil together.
2. Aerial imagery- We used readily-accessible and credentialed aerial imagery to measure the width (east to
west) and length (north to south) of 46 islands.
3. Categories of oiling- We used two categories of exposure to the 2010 BP Macondo oil: oiled and unoiled.
The definition was based on the multi-agency damage assessment operations called SCAT.
4. Sampling design and analysis- We compared the erosion of oiled and unoiled islands before and after the
2010 DWH oil spill. One of the reasons for this sampling design is that we were concerned that the final
distribution of oil might be directly related to variance in the shoreline erosion rates occurring before the spill.
Turner, R. E., McClenachan, G., & Tweel, A. W. (2016). Islands in the oil: Quantifying
salt marsh shoreline erosion after the Deepwater Horizon oiling. Marine Pollution
Bulletin, 110(1), 316-323.
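The before/after, oiled/unoiled comparison described in the sampling design can be sketched as follows, with hypothetical island widths (the study measured 46 real islands from registered aerial photos):

```python
import statistics

def erosion_rate(width_before, width_after, years):
    """Average shoreline retreat rate (m/yr) from two width measurements."""
    return (width_before - width_after) / years

# Hypothetical island widths (m) before and after one photo interval;
# none of these numbers come from the study
oiled   = [(50.0, 44.0), (62.0, 55.0), (40.0, 33.5)]
unoiled = [(48.0, 45.5), (60.0, 57.0), (42.0, 39.8)]

years = 1.0  # interval between photo dates
rate_oiled   = statistics.mean(erosion_rate(b, a, years) for b, a in oiled)
rate_unoiled = statistics.mean(erosion_rate(b, a, years) for b, a in unoiled)
ratio = rate_oiled / rate_unoiled  # >1 indicates enhanced erosion of oiled islands
```

Repeating the same contrast on pre-spill photo intervals checks the design's concern that oiled and unoiled islands already differed before the spill.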
6. Tools-
1. Linear measurement tool- Used to measure the east-to-west and north-to-south dimensions of a geomorphic
feature of each island appearing in all photos. Not all photos are precisely and accurately registered with
each other, so each measurement was positioned relative to an anchor point (the geomorphic feature)
that remains visible in each photo.
2. SCAT (Shoreline Cleanup Assessment Technique)- The SCAT assessment has six color-coded categories
indicating the degree of shoreline oiling: red, orange, yellow, green, light green and blue, equivalent
to heavy, moderate, light, very light, trace and no oil, respectively.
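As a data structure, the SCAT classification is a simple lookup from color code to oiling degree; the binary oiled/unoiled cutoff shown below is an illustrative assumption (the study's actual cutoff came from the multi-agency SCAT assessment):

```python
# SCAT color codes and the oiling degree each represents (from the slide text)
SCAT_CATEGORIES = {
    "red":         "heavy",
    "orange":      "moderate",
    "yellow":      "light",
    "green":       "very light",
    "light green": "trace",
    "blue":        "no oil",
}

def is_oiled(color):
    """Binary oiled/unoiled classification for the sampling design.

    Treating anything above 'no oil' as oiled is an assumption made
    here for illustration only.
    """
    return SCAT_CATEGORIES[color] != "no oil"
```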
Conclusion-
The average width, length and pre-spill erosion rate of the oiled and unoiled islands were similar, but in the
first 8 to 12 months after the oil spill the erosion rate of the oiled islands was about 275% of that of the unoiled
islands. The enhanced erosion rate of the oiled islands declined thereafter, remaining about twice the average
erosion rate of the unoiled sites over 2.5 years. There was no evidence of reversal in the aggregate, although
temporary re-vegetation may have taken place between the photographed intervals at individual islands.
7. Aim- This study examines the sensitivity of the maximum warning potential function to its input
parameters, in order to identify the subset of factors that matter most. The outcomes offer insight into
how a tsunami warning system functions.
Methods-
1. Model:
The locations of the six wave-rider buoys are denoted by b_w, w = 1,…,6; the latitudes and longitudes for
these positions were given. The representative locations of the generation points are denoted by g_u, u =
1,…,18. Major representative population centres were selected to provide a general estimate of the
population at risk in the Pacific Ocean; these are located at points denoted by p_v, v = 1,…,27. The
population at each point p_v is denoted by π_v, v = 1,…,27.
Braddock, R. D. (2003). Sensitivity analysis of the tsunami warning potential. Reliability
Engineering & System Safety, 79(2), 225-228.
8. Let y_d = 1 if wave-rider buoy b_d is occupied by a detector and y_d = 0 otherwise.
The total number of detectors, Y = Σ_d y_d, may be limited by capital or maintenance costs. The vector y of 0s and 1s
then represents a particular deployment of detectors.
Now consider the generation of a tsunami at time t = 0 at the generation point g_u. Let

w_{u,v}(y) = t_{u,v} − t*_u(y) − t_d − r_v,

where t*_u(y) is the travel time of the tsunami from g_u to the nearest occupied wave-rider buoy, t_d the
detection time for processing, detecting and signalling to confirm the generation of the tsunami, r_v the reaction
time of the emergency services and population to move to safety, and t_{u,v} the travel time of the tsunami from the
generation point to the population centre. Note that t*_u(y) and t_{u,v} can be calculated from travel-time charts.
Also let H(x) be the unit step function, with H(x) = 1 for x > 0 and H(x) = 0 otherwise.
9. The total warning potential is then defined as

P(y) = Σ_u Σ_v π_v H(w_{u,v}(y)),

the total population that can be warned in time over all generation points. The problem can then be
formalised as maximising P(y) over the deployments y, subject to Σ_d y_d ≤ Y.
The optimal solution may be found using curtailed enumeration.
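Curtailed enumeration over deployments can be sketched as a brute-force search with a detector budget; all travel times, populations, and time constants below are hypothetical toy values, not the study's data:

```python
from itertools import combinations

def warning_potential(occupied, t_star, t_uv, pops, t_d=0.5, r=1.5):
    """Total population receiving positive warning time.

    occupied : set of buoy indices with a detector
    t_star   : t_star[u][b] travel time (h) from source u to buoy b
    t_uv     : t_uv[u][v] travel time (h) from source u to centre v
    pops     : population at each centre v
    t_d, r   : detection time and uniform reaction time (h)
    """
    total = 0.0
    for u in range(len(t_uv)):
        t_detect = min(t_star[u][b] for b in occupied)
        for v, pop in enumerate(pops):
            w = t_uv[u][v] - t_detect - t_d - r  # warning time w_{u,v}(y)
            if w > 0:
                total += pop
    return total

# Toy problem: 2 generation points, 3 buoy sites, 2 population centres
t_star = [[0.5, 2.0, 4.0], [4.0, 2.0, 0.5]]
t_uv   = [[6.0, 3.0], [3.0, 6.0]]
pops   = [1_000_000, 500_000]

Y = 2  # detector budget
best = max(combinations(range(3), Y),
           key=lambda y: warning_potential(set(y), t_star, t_uv, pops))
```

With only three sites, full enumeration is trivial; curtailed enumeration prunes deployments that cannot beat the current best, which matters as the number of candidate sites grows.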
10. 2. Sensitivity Analysis:
The function depends on t*_u(y), t_d, r_v and t_{u,v}, all of which are continuous, and on Y, which is discrete.
The travel times t*_u(y) and t_{u,v} are generally accurate to within 5%. The detection time t_d varies between 20 and
50 min. The reaction time for the general population varies between 1 and 5 h, depending on the time of day and on
the education and training of the emergency services and general population. The population at risk, π_v, is
technically discrete but sufficiently large to be taken as continuous. The population numbers are only estimates
and are assumed to vary by ±50% about the data in Ref. [5]. The number of buoys is definitely discrete, and the
function varies discontinuously with Y.
A first-order-effects sensitivity analysis was performed on the function using the One-at-a-Time (OAT)
Morris method [6]. The parameters used in the analysis are shown in Table 1 and the ranges used are given
above. The results were very similar with respect to variations in the travel times t*_u, t_p and t_{u,v};
the travel time t_p is used to represent a generic travel-time factor with a range of ±5%. The response times
r_v were assumed to be independent of location and were replaced with a uniform response time r*.
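The OAT Morris screening itself can be sketched as follows, on a toy model rather than the warning-potential function: each factor is perturbed one at a time by a step Δ, the elementary effects on the output are recorded, and each factor is summarised by the mean μ and standard deviation (δ in the slides' notation) of those effects.

```python
import random
import statistics

def morris_oat(model, n_factors, n_trajectories=20, delta=0.1, seed=0):
    """One-at-a-Time Morris screening on the unit hypercube.

    Returns (mu, sigma) per factor: mean and standard deviation of the
    elementary effects  (f(x + delta*e_i) - f(x)) / delta.
    """
    rng = random.Random(seed)
    effects = [[] for _ in range(n_factors)]
    for _ in range(n_trajectories):
        x = [rng.uniform(0.0, 1.0 - delta) for _ in range(n_factors)]
        f0 = model(x)
        for i in range(n_factors):
            x_step = list(x)
            x_step[i] += delta  # perturb factor i only
            effects[i].append((model(x_step) - f0) / delta)
    mu = [statistics.mean(e) for e in effects]
    sigma = [statistics.stdev(e) for e in effects]
    return mu, sigma

# Toy model: factor 0 strong and linear, factor 1 weak, factor 2 interacting
model = lambda x: 5.0 * x[0] + 0.1 * x[1] + x[0] * x[2]
mu, sigma = morris_oat(model, n_factors=3)
# Rank factors by Euclidean distance in the (mu, sigma) plane,
# as in the Morris ranking described in the slides
rank = sorted(range(3), key=lambda i: (mu[i]**2 + sigma[i]**2)**0.5,
              reverse=True)
```

A large μ flags a strong average effect; a large σ flags nonlinearity or interaction with other factors, which is why the ranking uses both.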
12. Result:
The results of the OAT analysis are given in Table 1, where the mean (μ), standard deviation (δ) and Morris
ranking are given. The Morris ranking is calculated as the Euclidean distance in the (μ, δ) plane and is ordered
according to this distance. The sensitivity results are also ranked using only the mean, listed in the column
headed 'Rank using mean'. There are variations between the rankings, especially among the least sensitive
parameters. These differences are not significant, as the corresponding means or Euclidean distances are very
small. When changes in ranking for factors with μ ≤ 0.02 are considered, there are two factors whose ranking
changes by three rank orders: Factor 2, the response time of the detector, and Factor 30 (π_26), the population
of San Francisco. Both factors have the same mean value, but the standard deviation for Factor 2 is relatively
large. The results are also summarised on the (μ, δ) plot shown in Fig. 1.
The results show that Factors 3 and 4 rank highest with respect to the sensitivity of the tsunami warning
potential. Factor 3 is the response time of the emergency services and, whilst ranked second on the Morris
ranking, it is almost equivalent to Factor 4. Factor 4 relates to the number of warning buoys in the system,
a discrete variable. The warning potential increases rapidly when Y is small but then displays the economic
law of diminishing returns: when most wave-buoy sites are occupied, little increase in warning potential is
obtained by adding another buoy.
13. The third most important factor, Factor 17, corresponds to the population at risk in Tokyo, Japan, which is
also the largest population centre in the data used to construct the warning potential. The Morris Euclidean
distances for Factor 17 (ranked third) and Factor 2 (ranked fourth) are much smaller than for Factor 4 (ranked
first) and Factor 3 (ranked second), so these factors are much less important in the sensitivity analysis. All of
the other population centres have lower Euclidean distances, and the warning potential is relatively insensitive
to variations in the estimated populations. Some have not been ranked (indicated by NR in Table 1) due to the
very small values of μ and δ obtained in the analysis.
The calculation of the population at risk was only an approximation, used to illustrate the optimisation
calculations that determine the optimum number of detectors. This sensitivity analysis indicates that the use of
population estimates is justified in calculating the tsunami warning potential.
The factor t_p is ranked fourth in importance by the Morris ranking system. This term represents the
sensitivity of the warning potential to errors in the calculation of tsunami travel times. Note that the
Euclidean (Morris) distance between the factor π_13 (ranked third) and t_p (ranked fourth) is considerable;
errors in the bathymetry and in calculating travel times will have little effect on the estimated values of the
tsunami warning potential.
15. Aim- Cyclone-induced storm surge and the associated onshore flooding pose significant danger to life,
property and infrastructure at the time of landfall. The coastal belt along the East coast of India is thickly
populated and highly vulnerable to the impact of tropical cyclones.
Methodology-
1. The present study used a multi-layer, multi-output feed-forward ANN model trained with a
pre-computed dataset of storm tide and onshore inundation.
2. The optimum combination of transfer function in each layer and training function needs to be
determined initially by trial and error, which provides confidence in using the optimum
configuration. For training, more than 300 combinations of wind speed at landfall, landfall
location (latitude and longitude), translation speed, incident angle of the cyclone track w.r.t.
the shoreline, and prevailing tidal conditions were used as input variables to the ANN model.
Sahoo, B., & Bhaskaran, P. K. (2019). Prediction of storm surge and coastal inundation
using Artificial Neural Network–A case study for 1999 Odisha Super Cyclone. Weather
and Climate Extremes, 23, 100196.
16. 3. A similar number of scenarios was generated for storm tide and cross-shore inundation
(using the above combinations of input parameters) at 200 coastal destinations, which constitute
the output of the ANN model. A feed-forward ANN with two hidden layers and a
multi-dimensional output layer was used.
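A network of this shape (cyclone inputs → two hidden layers → multi-station output) can be sketched in plain NumPy; the layer widths, tanh transfer functions and random weights here are assumptions for illustration, not the study's tuned configuration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Layer sizes: 6 cyclone inputs (wind speed, landfall lat/lon, translation
# speed, track angle, tide) -> two hidden layers -> 200 coastal stations
sizes = [6, 32, 32, 200]
weights = [rng.normal(0, 0.1, (m, n)) for m, n in zip(sizes[:-1], sizes[1:])]
biases = [np.zeros(n) for n in sizes[1:]]

def forward(x):
    """One forward pass: tanh transfer in the hidden layers, linear output
    (surge/inundation heights are unbounded regression targets)."""
    a = x
    for W, b in zip(weights[:-1], biases[:-1]):
        a = np.tanh(a @ W + b)
    return a @ weights[-1] + biases[-1]

# One synthetic input vector of normalised cyclone parameters
x = rng.uniform(-1, 1, 6)
surge_at_stations = forward(x)  # one predicted value per coastal station
```

Training against the ~300 precomputed scenarios would then adjust the weights, e.g. by backpropagation, which is the step the trial-and-error selection of transfer and training functions refers to.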
17. Result:
1. Historical archives of tropical cyclone activity for the North Indian Ocean basin indicate that the 1999 Odisha
Super Cyclone was the worst ever-recorded cyclone to make landfall in Odisha State on the East
coast of India.
2. The devastation that resulted from this extreme weather event was severe in terms of the enormous loss
of life and property. The maximum wind speed at the time of landfall was about 260 km h−1. The
storm surge spanned a 100–150 km stretch of coastline, with a maximum surge exceeding 6 m to
the north of the landfall point.
Summary:
Accurate prediction of storm surge and associated coastal inundation during tropical cyclone landfall is an area of
immense scientific interest with direct socio-economic implications. Though sophisticated numerical models have
been developed over time for real-time operations, an effective, robust and cost-effective tool is indispensable for
timely warnings and planning operations.
18. Aim- This paper reviews achievements and findings from studies on the numerical modeling of
tsunamis since the 2011 Tohoku earthquake and tsunami, and addresses challenges for future advances. The
topics cover improvements in tsunami numerical modeling, including multi-physics simulations, and applications
to source modeling, hazard assessment, and real-time forecasting and warning.
Methodology-
1. Source modeling is still a primary concern for tsunami hazard assessment. Numerical modeling of the
source of the Tohoku earthquake and tsunami illustrated the spatially concentrated slip distribution and
the resulting tsunami.
2. The approach of quantitative (numerical) tsunami forecasting, which was employed in the operational
tsunami warning system in Japan until 2011, is the search of a precomputed database using focal parameters.
3. Many studies on the source modeling of the Tohoku earthquake and tsunami inferred the importance of
offshore monitoring for estimating the initial water level and fault slip.
Sugawara, D. (2021). Numerical modeling of tsunami: Advances and future challenges
after the 2011 Tohoku earthquake and tsunami. Earth-Science Reviews, 214, 103498.
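The precomputed-database forecasting approach amounts to a nearest-neighbour lookup keyed on focal parameters; a minimal sketch with hypothetical database entries and illustrative distance weights:

```python
# Each database entry: focal parameters (lat, lon, depth_km, magnitude)
# precomputed offline with a tsunami model, mapped to a coastal forecast.
# All values here are invented for illustration.
database = [
    ((38.3, 142.4, 24.0, 9.0), {"Sendai": 8.0, "Kamaishi": 9.3}),
    ((38.0, 143.0, 10.0, 7.5), {"Sendai": 1.2, "Kamaishi": 1.8}),
    ((36.5, 141.0, 30.0, 8.0), {"Sendai": 2.5, "Kamaishi": 2.0}),
]

def forecast(lat, lon, depth_km, magnitude):
    """Return the precomputed forecast whose focal parameters are closest
    to the estimated event (weighted squared distance; the weights are
    illustrative assumptions, not an operational scheme)."""
    def dist(params):
        plat, plon, pdepth, pmag = params
        return ((lat - plat) ** 2 + (lon - plon) ** 2
                + ((depth_km - pdepth) / 10.0) ** 2
                + (4.0 * (magnitude - pmag)) ** 2)
    return min(database, key=lambda entry: dist(entry[0]))[1]

# Event estimate close to the first stored scenario
wave_heights = forecast(38.2, 142.5, 20.0, 8.9)
```

The point of the scheme is speed: the expensive tsunami simulations are done offline, so the warning centre only pays for a table lookup once the focal parameters are estimated.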
19. Conclusion:
1. Tsunami numerical modeling is an essential tool for disaster management and countermeasures. During
the last 10 years, the tool has been improved, validated and applied to better predict tsunamis' behavior
and their impacts, aided by the groundbreaking Tohoku data.
2. An important role of the tool is to provide implications that bridge the findings from different disciplines,
which in turn contributes to disaster science.
3. Source modeling is still a primary concern for tsunami hazard assessment. Numerical modeling of the
source of the Tohoku earthquake and tsunami illustrated the spatially concentrated slip distribution and
tsunami.