AUVSI 2012 - Natural Disasters: A Future View in the Use of RPAS
NATURAL DISASTERS: A FUTURE VIEW IN THE USE OF RPAS
Laura Samsó Pericón*
Natural disasters like tornadoes, hurricanes, volcanoes, floods, earthquakes, and wildfires are devastating in terms of population and property losses. Some examples from around the world: earthquakes in Japan and New Zealand, Australian floods, Texas and New Mexico wildfires, and hurricane Irene, to name a few. Although catastrophes in general were fewer than in 2010, costs for damages were higher, almost $380 billion. Economic loss estimates caused by the earthquake in Japan reached $210 billion1. It is one of the costliest natural catastrophes of all time and the deadliest of 2011, resulting in more than 15,000 fatalities1. Today, concerns are focused on the future, where new victims and damage will occur. Is it possible to save more lives through the application of RPA technology? Is it possible to reduce risks to teams involved in SAR? Is it possible to enhance the overall mission operation by applying a force multiplier of drones instead of people in harm's way?
This paper presents a future view of how RPAs (Remotely Piloted Aircraft) could assist during a natural catastrophe based on continuous near-real-time (100 ms or less) monitoring, day and night; near-real-time planning and reconfiguration; autonomous decision and recognition; near-real-time ground moving element detection and recognition; etc. High-level plans are composed by a pilot-in-command at a ground site, who uploads mission details and target coordinates to an RPA. When this RPA arrives in the affected area, autonomous flight and mission parameters take over, and it deploys and controls a swarm of small RPAs that perform a wide range of operations, like video surveillance, personnel location identification, and Damage Assessment (DA). Similarly, this paper discusses the various sensor capabilities needed to support rescue and assessment under differing conditions: IR (SW, MW and LW), NVG, LIDAR, SAR and LLLTV.
INTRODUCTION
Over recent years, natural disasters have been increasing their effects on property and people, increasing the costs of damages and fatalities; in addition, the global economic crisis seems to have slowed down the development of some countries/continents, to such a point that a question arises: do natural disasters have long-term effects on economic growth?2 It is a complex question that is not going to be covered in depth, but it is a starting point to understand why it is so compelling to
* MSc in Aerospace and Science Technology, PMI Aerospace and Defense CoP Assistant Manager Outreach for EMEA, e-mail: laura.samso.pericon@gmail.com
find a solution to questions such as: Is it possible to save more lives? Is it possible to reduce the risks of teams involved in evaluation missions? Is it possible to enhance the overall mission operation?
Figure 1. 2011 Arizona and New Mexico "Wallow" fires, NASA MODIS.
Figure 2. 2011 Irene Hurricane: Visible and Thermal image. NASA/NOAA GOES.
Some examples that happened around the world during 2011 were earthquakes in Japan and New Zealand, Australian floods, Texas and New Mexico wildfires (see Figure 1), and hurricane Irene (see Figure 2), to name a few.
Figure 3. 2011 Natural Catastrophes World Map.
Figure 3 presents the total number of natural catastrophes during 2011 around the globe, emphasizing the most significant loss events with bigger circle marks and classifying them by color depending on the type of natural disaster. In general, catastrophes were fewer than in 2010, although costs for damages were higher, almost $380 billion. Economic loss estimates caused by the earthquake in Japan reached $210 billion1.
Figure 4. 2012 Strategic Pivot to Asia-Pacific3,12.
It is one of the costliest natural catastrophes of all time and the deadliest of 2011, resulting in more than 15,000 fatalities, not including people still missing (more than 3,000) and presumed dead; in total, more than 20,000*. Today, concerns are focused on the future, where new victims and damage will occur, and on trying to answer the questions stated above.
Most of the reported major natural disasters occurred in developing countries, which have poor capacity to manage emergency situations and therefore sustain heavy losses of people and assets. When trying to assess the future behavior of the different regions, different parameters need to be introduced into the equation to get a reliable solution. Figure 4 shows that from 2001 to 2010, around 70,000 people per year died in the Asia Pacific (AP) region due to natural disasters, accounting for 65% of the world's total deaths from such causes; note that Asia Pacific contains 61% of the world's population. It is important to take into account that 15 of the world's 28 megacities and 12 of the top 15 US trading partners are located in the AP area, thus a disaster there will have a real impact on its economy and development.
Figure 5. Sources of Stress3.
Until now only natural disasters have been presented, but the interaction between physical and technological hazards4 can also cause emergency situations with the same deadly and costly effects. Japan declared an "atomic power emergency" in March 2011 due to the earthquake and tsunami that damaged the Fukushima Daiichi Nuclear Power Plant and caused a radioactive leak. These types of synchronous failures do not only affect the specific region where they happen; they are felt globally, influencing capital markets and the energy industry. Figure 5 presents some of the possible future sources of stress around the world, such as oil production regions, nuclear-armed states and the different existing processing sites, water stress areas, industrial pollution and waste hotspots, undernourished populations, etc.
Now that the problems posed by different hazards have been presented, it is time to address the questions stated above: Is it possible to save more lives? Is it possible to reduce the risks of teams involved in evaluation missions? Is it possible to enhance the overall mission operation?
_________________________
* http://earthquake.usgs.gov/earthquakes/eqarchives/year/2011/2011_deaths.php
This paper presents how Remotely Piloted Aircraft (RPA) could have a niche with civil authorities in assessing the damage* caused by natural disasters and assisting emergency teams in near-real time, day and night and in all-weather conditions. In order to perform a damage assessment operation it is necessary to count on different visual techniques beyond the classical ones; thus the various sensor capabilities needed to support rescue and assessment under differing conditions, such as IR (SW, MW and LW), Night Vision (NVG, LLLTV), LIDAR and SAR, will be discussed.
DISASTER MANAGEMENT: REMOTELY PILOTED AIRCRAFT IN SEARCH AND RESCUE MISSIONS
Disaster management needs are focused on minimizing the effects of catastrophes. The issues a disaster management strategy should address relate to mitigation, preparedness, response and recovery5. Flexibility must be built into the plan to take into account each disaster's differences and the key data required, assessing and/or discounting the data for prioritization and resource application. The crucial information needs will have to be identified.
When a catastrophe takes place, SAR teams are activated, but the first step, before sending them to the area, is to perform a DA of the situation. This permits users to choose the appropriate resources, distribute and prioritize them in an effective way, and avoid unnecessary risks. This assessment is commonly provided through different worldwide organizations dealing with space imagery, and also using manned aircraft carrying imaging payloads and communication systems, such as the US Coast Guard, FEMA, etc.
Hurricane Katrina6 in 2005 was the most destructive natural disaster in U.S. history, and it made evident the need for a more flexible response and a larger Federal role in disaster contingency plans. Challenges arose in national preparedness, the integrated use of communications, logistics, evacuation, SAR operations, public health and medical support, foreign assistance, non-governmental aid, and citizen and community preparedness. This experience favored the creation of different international agencies, the enhancement of existing processes and the creation of corrective recommendations.
The United Nations International Strategy for Disaster Reduction (UNISDR) provides assessment of risk worldwide, but individual countries must trace a map of development and response, pre- and post-disaster, identifying possible hazard areas and performing risk assessments. UNISDR's main task is focused on disaster risk reduction (DRR) activities, and it leads the preparation of the Global Platform for Disaster Risk Reduction, a global forum for disaster risk reduction created in 2006.
The International Charter Space & Major Disasters7 encompasses different worldwide space agencies, such as the European Space Agency (ESA), the National Oceanic and Atmospheric Administration (NOAA), the USGS (United States Geological Survey) and the Canadian Space Agency (CSA), among others, that provide imagery data to other organizations when a catastrophe takes place, but this is not enough to properly respond to the disaster. Its principal aim is to provide a unified system of space data acquisition in the event of a natural or man-made disaster for an immediate emergency response; the organization has covered about 314 disasters in approximately 100 countries around the world.
_______________________________
* Damage assessment includes property damage, loss of life and ground moving element detection and recognition.
Figure 7. International Charter Space & Major Disasters Chart7.
One of the U.S. national disaster management agencies is the Federal Emergency Management Agency (FEMA), which responds to emergencies and disasters, mitigates their effects, reduces the risk of loss and prevents future events. In order to obtain damage assessment, some satellites controlled by different agencies are used. The United States Geological Survey (USGS) Earth Resources Observation and Science (EROS) center8,9 provides information (pre- and post-disaster) through the Landsat, ASTER and MODIS satellites, and from the military side, the National Guard provides support to local civilian authorities. All this data is coordinated by FEMA, as stated above.
Figure 8. Satellite Imagery Support8.
The figure above presents the common roadmap to obtain satellite imagery support when a disaster occurs in the United States.
In order to provide complete and quick damage assessment, the use of RPAs is introduced, because the most critical span of time for SAR operations after a disaster is the first 24 h, and SAR coordination centers generally don't have enough information until the first 48 h. Certainly it is not the first time that an RPA has been used for SAR: the devastation caused by Katrina challenged NOAA to develop a program involving the use of advanced technology such as Unmanned Aircraft Systems (UAS)10. As such, unmanned aircraft systems used in Iraq and Afghanistan were deployed to find people trapped in New Orleans' buildings destroyed by hurricane Katrina's flood waters.
Table 1. Disaster Management Phases9.

Prevention / Preparedness: RPAs can monitor disaster-prone areas before situations become critical.
Response: RPAs can assess disasters during their occurrence.
Recovery: RPAs can help in saving lives and property and assist emergency teams.
Table 1 shows how RPAs could help in disaster situations through the phases of Prevention, Preparedness, Response and Recovery. At the same time, technological and other challenges must be addressed to safely operate RPAs in the national airspace. These include reliable air-ground communications, the ability to detect and avoid obstacles, levels of autonomy, massive data handling in near-real time, propulsion solutions, pilot training, etc.
The next section covers the definition and characteristics of remote sensing methods, some of these challenges, the high-level requirements, and the architecture of a scenario using RPAs and their role within an emergency event.
REMOTE SENSING
Remote sensing is defined as the science of acquiring data about the Earth's surface without contact with it*. In other words, it is the study of the Earth's surface using passive and active remote sensors that gather information by measuring the electromagnetic (EM) spectrum that is reflected, emitted and absorbed by objects in various spectral regions, from gamma rays to radio waves. A passive sensor consists of an array of detectors which record the amount of EM radiation emitted by a surface; on the contrary, an active sensor transmits pulses of energy to the object and measures the radiation that is reflected or backscattered from it. Some examples of active technologies are LIDAR, SAR/MMW, etc.
_______________________________
* American Meteorological Society, Glossary http://amsglossary.allenpress.com/glossary/search?id=remote-sensing1 (accessed 07/07/2012)
Electromagnetic Spectrum
The figure below presents the electromagnetic spectrum, which is divided from the shorter wavelengths (gamma rays and x-rays) to the longer wavelengths (radio waves).
Figure 9. Electromagnetic Spectrum.
(Credit: Canada Center for Remote Sensing)
Understanding the characteristics of the electromagnetic radiation in terms of their wavelength and frequency is crucial to understanding the information to be extracted from remote sensing data. Some of the different spectrum ranges will be further discussed in the coming sections.
Spatial Resolution, Spectral Resolution and Pixel Size
The distance from which an image is taken gives insight into the level of detail the sensor will acquire.
Figure 10. Remote Sensing Acquisition.
(Credit: Canada Center for Remote Sensing)
Resolution of digital images is given by terms such as spatial resolution, spectral resolution and pixel size. Spatial resolution refers to the smallest size that can be detected. In the case of passive sensors, this magnitude depends on their Instantaneous Field of View (IFOV), defined as the angular cone of visibility of the sensor (A); it determines the area on the ground which is "seen" at a given altitude at one particular moment in time (B). The size of the area viewed (the resolution cell) is equal to the IFOV multiplied by the distance from the ground to the sensor (C).
Spectral resolution is given by the spectral response that characterizes the reflectance/emittance of an object over a variety of wavelengths. In order to distinguish certain details, finer wavelength ranges are needed, thus a higher spectral resolution and narrower bands over varying conditions such as rain, snow or night11.
A pixel is the smallest unit of an image; it is normally square and represents the smallest resolvable part of an image. The Ground Sampling Distance (GSD) is the distance between pixel centers measured on the ground; i.e., 1.0 m GSD means that each pixel represents a ground area measuring 1 m x 1 m.
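The resolution-cell relation above (IFOV multiplied by sensor-to-ground distance) can be sketched numerically. A minimal example under the small-angle approximation; the 0.5 mrad IFOV and 2,000 m altitude are hypothetical illustration values, not figures from the paper:

```python
def resolution_cell(ifov_mrad: float, distance_m: float) -> float:
    """Ground resolution cell size (m): IFOV (in milliradians, small-angle
    approximation) multiplied by the sensor-to-ground distance."""
    return (ifov_mrad * 1e-3) * distance_m

# A hypothetical sensor with a 0.5 mrad IFOV flown at 2,000 m altitude
cell = resolution_cell(0.5, 2000.0)  # 1.0 m: features ~1 m across fill one cell
```

With these numbers the platform resolves roughly 1 m features, comparable to a 1.0 m GSD as defined above.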
Multispectral sensors are remote sensing systems that record data or energy over separate wavelength ranges at various spectral resolutions. Natural disasters are complex environments where advanced multispectral sensors called hyperspectral sensors11 will be needed, as they can detect hundreds of narrow spectral bands throughout the visible, near-infrared, and mid-infrared portions of the electromagnetic spectrum.
Figure 11. Concept of Imaging Spectroscopy.
(Credit: Randall B. Smith, Ph.D â MicroImages, Inc.)
CHALLENGES, HIGH LEVEL REQUIREMENTS AND ARCHITECTURE
Past experiences in SAR operations, e.g. during the Wenchuan earthquake5 or after hurricane Katrina as discussed above, have shown the need for better prioritization and management of the emergency teams and services over the affected areas.
When a disaster takes place13, authorities need to perform a damage assessment to reestablish communications in the area and to send emergency teams to the affected spots to rescue survivors. This assessment is performed using high-resolution aerial and satellite images together with information from the ground forces once they get into the region. Finally, the acquired data must be compared with previous Earth observation spatial databases to detect changes. This implies a huge amount of data to be fused by the algorithms and sent in near-real time to the HQ. Quantum computers could be the solution, but they are still under development. The imaging technologies concerned are discussed in the following sections.
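The comparison against archived Earth observation data described above is, at its simplest, per-pixel change detection. A minimal sketch, assuming two co-registered, radiometrically normalized rasters; real pipelines add registration, normalization and classification stages:

```python
import numpy as np

def change_mask(pre: np.ndarray, post: np.ndarray, threshold: float) -> np.ndarray:
    """Flag pixels whose radiometric value changed by more than `threshold`
    between the archived (pre-disaster) and freshly acquired (post-disaster)
    scenes. Assumes both rasters are co-registered and normalized."""
    return np.abs(post.astype(float) - pre.astype(float)) > threshold

# Toy 3x3 scene: one strongly changed pixel (e.g. a collapsed structure)
pre = np.array([[10, 10, 10], [10, 50, 10], [10, 10, 10]])
post = np.array([[11, 10, 9], [10, 5, 10], [10, 10, 12]])
mask = change_mask(pre, post, threshold=20.0)  # only the center pixel is flagged
```

The flagged pixels would then feed the prioritization step, directing emergency teams to the areas with the strongest detected change.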
Autonomy is another challenge to face, as some operations are thought to require different grades of autonomy. An autonomous system is self-directed toward a goal with no external control, but governed by laws and strategies that direct its behavior, facing unpredictable situations and reacting accordingly. This implies that the system must be able to sense and understand the environment; although such capabilities are not fully available at the moment, advances in computational intelligence (neuro-fuzzy networks) could help in the future. One extension of the autonomy concept is cooperation between autonomous systems, meaning coordination between them to achieve common goals, permitting better management of the services and emergency teams. This is what is introduced with the small-RPA swarm concept14,15,16.
Automatic Target Recognition (ATR) and identification implies determining a target's visual and other distinguishing characteristics and, using this signature, having the system automatically identify the object as the probable target. Usually this is done following a centralized ATR strategy, but the architecture proposed here considers using swarming strategies for conducting ATR in a distributed form. It will be necessary to enhance the existing mapping software and/or develop new software packages and identification databases in order to provide identification and assessment in near-real time.
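One simple way to realize distributed ATR across a swarm is to fuse per-platform detection reports at a coordinating node. The sketch below is an illustration of the idea, not the paper's method: each swarm member reports a (target, confidence) pair, a target needs a minimum number of independent votes, and confidences are combined with a noisy-OR rule; all names and thresholds are hypothetical:

```python
from collections import defaultdict

def fuse_detections(reports, min_votes=2):
    """Distributed ATR sketch: `reports` is a list of
    (member_id, target_id, confidence) tuples from individual swarm members.
    A target is confirmed only when at least `min_votes` members report it;
    confidences are combined with a noisy-OR so several weak cues add up."""
    votes = defaultdict(list)
    for member_id, target_id, conf in reports:
        votes[target_id].append(conf)
    fused = {}
    for target_id, confs in votes.items():
        if len(confs) >= min_votes:
            miss = 1.0
            for c in confs:
                miss *= (1.0 - c)  # probability that every member missed
            fused[target_id] = 1.0 - miss
    return fused

# Two platforms agree on a possible survivor; a single weak debris cue is dropped
reports = [("rpa1", "person_A", 0.6), ("rpa2", "person_A", 0.5),
           ("rpa3", "debris_B", 0.4)]
fused = fuse_detections(reports)  # {'person_A': 0.8}
```

The voting threshold trades false alarms against missed detections, which is exactly the kind of parameter HQ would tune per disaster type.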
But what if we do not have reliable communication channels? At that point, bandwidth efficiency needs to be improved, with systems requiring less power and less weight. Small RPAs use Line Of Sight (LOS) communications, while larger platforms such as the Global Hawk or the Predator use LOS and BLOS (Beyond Line Of Sight).
On the other hand, human factors will play an important role in any operation integrating RPAs, thus requiring training for the Pilot-In-Command (PIC) and other personnel on the ground in order to interpret this amount of information and to control the RPA system.
In this proposal, an RPA with the capacity to transport a swarm of small RPAs on board is added to the air segment to support and perform different SAR operations.
Some of the proposed high-level requirements of the system are the following:
- Damage assessment of 1 square kilometer per second, day/night, in all weather conditions.
- Provide near-real time assessment to the Ground Control Station (GCS)/HQ.
- Provide Beyond Line of Sight (BLOS) capability.
- Provide operation capability when GPS is not available.
A subset of requirements will be composed of:
- Continuous near-real time monitoring, planning and reconfiguration.
- Level of decision autonomy.
- Ground moving target detection and recognition.
Figure 12. Proposed High Level Architecture.
There will be two segments (see Figure 12), Air and Ground, connected through a data link channel. The Air Segment will be composed of the classical reconnaissance aircraft, space-based imagery and the proposed RPA system, with its subcomponent RPA swarm composed of a group of collaborative small unmanned machines. The Ground Segment will be composed of the Ground Control Station (GCS)/HQ, the Search and Rescue teams, the Alarm Center Coordination (HQ), space and disaster agencies, and hospitals/telemedicine teams.
Air Segment Operative
Classical reconnaissance aircraft, equipped with visible and IR (MWIR and LWIR) sensors, and weather satellites provide the needed generic assessment from high altitudes in order to determine the extent of the disaster and possible collateral damage, but additional images at lower altitudes are needed to generate an accurate 3D map at the HQ.
Once the emergency event takes place, the RPA, with a swarm of small unmanned machines on board, takes off from the HQ or from a selected air force airport towards the catastrophe area. From the ground, a pilot-in-command can upload new parameters and target coordinates to the RPA (reliability/trajectory concept), reconfiguring the operation in near-real time. When the RPA reaches the emergency area, the system begins identifying the type of disaster and relevant topographic elements, and uses undestroyed man-made structures or geodetic points as references to map the destroyed area. In case none of these were usable, the RPA would drop some of the swarm components to the ground to act as beacons. These beacons could be used as communication points, reference points for mapping from the air, etc. Meanwhile, the rest of the swarm components will begin constructing a mobile communication mesh inside or outside the emergency area, depending on weather conditions. Other options include the use of the RPA itself, satellites or blimps as communication relays (i.e., BLOS). This will be challenging, as the RPA/blimp won't be able to withstand strong winds or extreme weather, so efforts to find a reliable solution should be put here.
When communications are established, coordination with the HQ becomes more relevant, and the first imaging information is sent to the emergency center and integrated with all the other data sent from manned aircraft and satellites.
It is then time to coordinate the swarm of small RPAs and assign them different collaborative tasks like damage assessment and target detection and identification. To conduct these tasks, a distributed system17 is proposed alongside the traditional multispectral/hyperspectral sensors, consisting of a wide-area airborne surveillance sensor that permits the emergency teams to choose images from different angles and broadcast them at the same time. This would enhance operations in terms of data acquisition, as the operator at HQ would be able to decide where to point the cameras (E/O, IR, LIDAR, MMW/SAR, NVG/LLLTV) on the RPA or on the swarm components, saving bandwidth by choosing which data to record and download.
Ground Segment Operative
Once the first images arrive at the alarm center control (HQ), they are compared with previous global Earth observation databases, and the first 3D holographic image of the area is displayed.
Figure 13. 3-D Holographic Disaster Area Representation.
(Credit: Army Research Laboratory)
This type of image is more intuitive and natural than the typical 2-D display and permits simultaneous viewing by a group of people, which improves damage assessment and planning in general. The pilot-in-command will be able to control, monitor and upload new operation plans to the RPA and/or to the swarm components just by pinpointing the needed resource in the 3-D hologram. It is important to count on a well-integrated GIS platform in relation to spatial data and crisis management systems18.
IMAGING SENSOR TECHNOLOGIES
Damage assessment and target identification and recognition are based on the imaging data acquired through different types of sensors19. It is important to take into account at this point how particles and gases in the atmosphere can affect the radiation signal before it arrives at the Earth's surface, resulting in scattering and absorption effects.
Figure 14. Energy/Radiation vs Atmospheric Windows.
(Credit: Canada Center for Remote Sensing)
Figure 14 shows the relation between sources of energy/radiation and the atmospheric windows, which are the areas not affected by the above-mentioned effects; from this it is possible to extract the most effective wavelengths for remote sensing.
The proposed solution is based on the distributed combination of the following technologies: Electro-Optical (E/O), infrared (IR), Laser Imaging Detection and Ranging (LADAR or LIDAR), MMW (Millimeter Wave radar)/SAR (Synthetic Aperture Radar), and NVG (Night Vision Goggles)/LLLTV (Low Light Level Television), pursuing the capability to look at a natural disaster at the same time through different wavelengths, depending on the environmental conditions and catastrophe type.
Infrared (IR) Sensors
IR systems create an image of observed objects using the IR radiation emitted or reflected by them; they are passive, less prone to jamming, and provide night-time imaging capabilities.
Table 2. IR sub-division scheme20.

Division Name | Abbreviation | Wavelength | Characteristics
Near IR | NIR, IR-A DIN | 0.75-1.4 µm | This band is the closest to visible light. Image intensifiers are sensitive to this area. Night vision devices/LLLTV/image intensifiers (IIS).
Short-wavelength IR | SWIR, IR-B DIN | 1.4-3 µm | The resolution is good, but illumination is needed to get the reflected IR. Can penetrate fog, smoke and atmospheric haze at long range and remains unaffected by thermal crossover. Can penetrate glass.
Mid-wavelength IR | MWIR, IR-C DIN | 3-8 µm | Image appears hazy and with low resolution; sensitive to rain. Can see through smoke, fog and dust. No vision through glass.
Long-wavelength IR | LWIR, IR-C DIN | 8-15 µm | This is the "thermal imaging" region; no external light or thermal source (sun, moon, thermal illuminator) is required. Forward-looking IR (FLIR) systems provide an oblique perspective rather than the nadir view of the Earth's surface provided by classical thermal imaging sensors11. Attenuation factors shall be taken into account due to atmospheric humidity levels. Can see through fog, dust and smoke, but resolution is low. Sensitive to rain. No vision through glass.
Far IR | FIR | 15-1000 µm | Terahertz laser.
Different IR division approaches actually exist, but for the purposes of this paper the division of IR shown in Table 2 will be used. The IR spectrum ranges from around 0.75 to 1000 µm. NIR and SWIR are sometimes called reflected infrared (use of external illumination), while MWIR and LWIR are referred to as thermal infrared (use of emitted radiation). Concerning the NIR range, it encompasses image intensifier systems (IIS) and Low Light Level Television (LLLTV) cameras.
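The sub-division of Table 2 can be captured as a small classification helper, useful for example when tagging which sensor on the swarm should service a given spectral request. A minimal sketch built directly from the table's band limits:

```python
def ir_band(wavelength_um: float) -> str:
    """Classify a wavelength (in micrometers) into the IR sub-bands of Table 2."""
    bands = [
        (0.75, 1.4, "NIR"),     # reflected IR; image intensifiers / LLLTV
        (1.4, 3.0, "SWIR"),     # reflected IR; penetrates fog, smoke, haze, glass
        (3.0, 8.0, "MWIR"),     # thermal IR; sees through smoke/fog/dust, not glass
        (8.0, 15.0, "LWIR"),    # thermal IR; no external illumination required
        (15.0, 1000.0, "FIR"),  # far IR / terahertz region
    ]
    for lo, hi, name in bands:
        if lo <= wavelength_um < hi:
            return name
    raise ValueError("wavelength outside the IR range of Table 2")

band = ir_band(10.0)  # 'LWIR': the thermal-imaging band used by FLIR systems
```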
Figure 15. Fire fighting I (Credit: FLIR).
15. 15
Figure 16. Fire fighting II (Credit: INAER Spain).
A couple of examples using this portion of the spectrum (LWIR/FLIR), at different heights and taken during firefighting operations, are presented above. This technology is used to assist emergency teams on the ground, to locate hot spots, people inside buildings, etc.21,22,23
Other natural disasters, such as volcanic eruptions, imply not only geophysical changes (i.e., displacements) but also geochemical ones. Satellite sensors can provide a generic view of the event, but IR sensors in an aircraft can then be used to monitor pre-eruption and eruption phases24, and data such as lava volumes and temperature distributions, among others, can be extracted25.
Figure 17. Eyjafjallajökull volcano (Iceland) on explosion (Credit: FLIR).
Figure 17 presents a thermal shot during a volcanic eruption. Different factors (atmospheric effects, volcanic gases and ashes, etc.) that could lead to an underestimation of the temperature of the object should also be taken into account. Generally, for this kind of survey, the thermal strip from 8 to 15 µm is used but, depending on the application, e.g. high-temperature measurements, it could be better to use the MWIR range, due to the greater sensitivity and reduced atmospheric effects.
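The temperature underestimation mentioned above comes largely from assuming a blackbody target: a thermal camera reports a brightness temperature, which must be corrected for the target's real emissivity by inverting Planck's law. The sketch below illustrates the emissivity part only and deliberately neglects atmospheric transmission and path radiance, which a real survey would also model; the 900 K lava reading and 0.95 emissivity are hypothetical values:

```python
import math

C1 = 1.191042e-16  # first radiation constant 2hc^2 (W m^2 sr^-1)
C2 = 1.4388e-2     # second radiation constant hc/k (m K)

def planck_radiance(wl_m: float, temp_k: float) -> float:
    """Blackbody spectral radiance B(lambda, T) in W m^-3 sr^-1."""
    return C1 / (wl_m**5 * (math.exp(C2 / (wl_m * temp_k)) - 1.0))

def corrected_temperature(wl_m: float, brightness_temp_k: float,
                          emissivity: float) -> float:
    """Brightness temperature assumes emissivity 1. Recover the radiance the
    camera actually measured, divide by the real emissivity, and invert
    Planck's law to estimate the true surface temperature."""
    radiance = planck_radiance(wl_m, brightness_temp_k) / emissivity
    return C2 / (wl_m * math.log(1.0 + C1 / (wl_m**5 * radiance)))

# At 10 um, a lava surface read as 900 K with an assumed emissivity of 0.95
t_true = corrected_temperature(10e-6, 900.0, 0.95)  # slightly above 900 K
```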
In the case of earthquakes, besides the RPA's on-board radar sensors, it could be useful to use visible and NIR imagery from satellite systems (e.g. LANDSAT) to detect active fault zones for planning purposes.
Microwave
Radars are active sensors that produce their own illumination: the sensor illuminates the terrain and receives the reflected signals, producing an image. They are useful for identifying geologic and geomorphologic characteristics.
This portion of the spectrum, from 300 GHz (1 mm) to 300 MHz (1 m) with some fuzziness at the ends, is becoming more interesting for different applications. Systems equipped with this kind of sensor can provide damage assessment and perform TI and recognition tasks, as they permit vision through clouds, fog, sand, etc.
Figure 18. Microwave Spectrum.
(Credit: Canada Center for Remote Sensing)
Figure 18 presents the microwave spectrum, where the different bands can be observed. Commonly used systems comprise X-band, K-band and Ka-band wavelengths, but the most common commercial equipment is the X-band airborne radar system.
One of the systems within this category is the Synthetic Aperture Radar (SAR)26,27, used for assessment, for example, in different earthquake phases (ground deformation detection) and for target identification. The use of bistatic techniques, consisting of separating the transmitter and the receiver by a considerable distance, would back up the current monostatic system carried by the RPA in case of problems.
The range of frequencies located around 40 GHz (7.5 mm) to 300 GHz (1 mm) belongs to MilliMeter-Wave (MMW)28 radiation, and it is beginning to be explored for natural disaster applications. It can provide all-weather, day/night imaging, including vision through fog, clouds, smoke and dust, taking into account the respective atmospheric limitations.
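The frequency-wavelength pairs quoted for the microwave and MMW bands follow from the free-space relation λ = c/f, which can be checked directly:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def wavelength_mm(freq_ghz: float) -> float:
    """Free-space wavelength in millimeters for a radar frequency in GHz."""
    return C / (freq_ghz * 1e9) * 1e3

wl_300 = wavelength_mm(300.0)  # ~1 mm, the upper edge of the MMW band
wl_40 = wavelength_mm(40.0)    # ~7.5 mm, the lower edge quoted in the text
wl_10 = wavelength_mm(10.0)    # ~30 mm, X-band, the common commercial airborne radar
```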
Shorter wavelengths constitute terahertz (THz) radiation which, from an opticist's point of view, is also known as Far IR (see Table 2).
Light Detection and Ranging or Laser Imaging Detection and Ranging (LIDAR)
LIDAR is similar to radar, with the difference that the sensor sends and receives pulses of light instead of radio waves. LIDAR uses shorter wavelengths of the electromagnetic spectrum, typically the ultraviolet, visible and NIR, but is highly sensitive to aerosols and cloud particles, thus it is not all-weather (some haze is manageable). This active imaging technology could be used day/night/shadow from airborne platforms to predict and assess floods, monitor damage and perform assessment after earthquakes, and predict and monitor volcanoes; coupled with GIS databases and the autopilot, it shall permit the RPA to take off, navigate, transit, select a landing site and land for medical delivery, to detect the heights of trees and dense foliage, and to detect the distance to obstacles in front of the RPA, to name a few.
Figure 19. 2010 LIDAR Chile Earthquake and Tsunami imagery.
(Credit: Oregon State University Geomatics Lab).
Damage assessment consists of the production of Digital Terrain Maps (DTMs) of affected or inaccessible areas. Figure 19 presents the results of using LIDAR to reconstruct the 2010 Chilean earthquake and tsunami scenario. The high quality of the obtained images can be observed, permitting a virtual flight over the damaged area from a 3D point of view.
Some of the ongoing developments are focused on reducing cost, size, power and processing time, resulting in systems able to create a near-real time DEM image of the surveyed region and to obtain a denser cloud of points from higher altitudes29,30,31,32,33,35.
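The DTM production step above boils down to gridding a LIDAR point cloud into elevation cells. A deliberately minimal sketch: it keeps the lowest return per cell as a crude bare-ground estimate, whereas production pipelines use proper ground-classification filters and interpolation; the sample points are invented:

```python
import math

def grid_dtm(points, cell_m: float) -> dict:
    """Bin (x, y, z) LIDAR returns into square cells of side `cell_m` and
    keep the lowest return per cell as a rough bare-ground elevation."""
    dtm = {}
    for x, y, z in points:
        key = (math.floor(x / cell_m), math.floor(y / cell_m))
        if key not in dtm or z < dtm[key]:
            dtm[key] = z
    return dtm

# Toy cloud: a canopy hit (12.0 m) above two ground hits (3.1 m, 3.4 m)
points = [(0.2, 0.3, 12.0), (0.8, 0.1, 3.1), (1.5, 0.4, 3.4)]
dtm = grid_dtm(points, cell_m=1.0)  # {(0, 0): 3.1, (1, 0): 3.4}
```

Taking the minimum per cell is what lets the canopy return be discarded in favor of the ground return in the same cell.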
Night Vision
Night vision systems are defined as systems based on image intensifier tube technologies that have the ability to see in low-light conditions, like the well-known Night Vision Goggles (NVG) and the above-mentioned LLLTV. They operate near visible light and in the NIR range, from 0.4 to 1.0 µm.
Figure 20. Comparison of night vision and thermal imaging under poor night contrast conditions.
Figure 21. Comparison of night vision and thermal imaging under night smoke/fog conditions.
CONCLUSIONS
Damage assessment and mission operations in natural disasters could be enhanced by introducing RPA technology into the disaster management cycle.
Collaborative, distributed, and disaster-event-oriented imaging will represent a major advancement with respect to existing capabilities.
Near-real-time processing capabilities could be enhanced using new compression algorithms and processing hardware.
A better and more complete integration between GIS and the different spatial databases is needed in order to enhance the disaster management cycle.
ACKNOWLEDGMENTS
The author would like to acknowledge the company INAER España, Search and Rescue (SAR) España, the FLIR company, USGS, NOAA, the Canada Center for Remote Sensing, the International Charter Space & Major Disasters, the American Society for Photogrammetry and Remote Sensing (ASPRS), and the Institute of Geomatics (Spain) for their valuable help in granting the author permission to use photos, data, and information. In addition, the author would like to acknowledge the people involved who gave their time and effort during this research.
REFERENCES
1 NOAA National Geophysical Data Center Report, "Japan Earthquake and Tsunami, March 2011", 2011.
2 M. Vesal, "Long Run Effects of Natural Disasters", London School of Economics, UK, 2011.
3 General J.F. Amos, Commandant of the Marine Corps, "America's Expeditionary Force in Readiness", Naval War College, 2012.
4 United Nations, "Global Assessment Report for Disaster Risk Reduction 2011", 2011.
5 Z. Zhang, Y. Zhang, T. Ke, "Remote Sensing in the Wenchuan Earthquake", Photogrammetric Engineering and Remote Sensing Magazine, 75(5): 506-513, 2009.
6 K.S. Pratt, R. Murphy, S. Stover, "CONOPS and Autonomy Recommendations for VTOL Small Unmanned Aerial System Based on Hurricane Katrina Operations", Journal of Field Robotics, 26(8): 636-650, 2009.
7 T. Stryker, B. Jones, "Disaster Response and the International Charter Program", Photogrammetric Engineering and Remote Sensing Magazine, 75(12): 1342-1344, 2009.
8 J. Dickerson, "International Charter Brief", International Charter Space and Major Disasters, MGIO, 2009.
9 K.A. Duda, B.K. Jones, "USGS Remote Sensing Coordination for the 2010 Haiti Earthquake", Photogrammetric Engineering and Remote Sensing Magazine, 77(9): 899-907, 2011.
10 S. Summers, "Potential Role of UAS in Hurricane Prediction", 60th NOAA Interdepartmental Hurricane Conference, 2006.
11 Canada Center for Remote Sensing, "Fundamentals of Remote Sensing Tutorial", Canada Center for Remote Sensing.
12 United Nations Economic and Social Commission for Asia and the Pacific (ESCAP), "Statistical Yearbook for Asia and the Pacific 2011", 2011.
13 US Department of Commerce, NOAA - National Oceanic and Atmospheric Administration and National Weather Service, "A Guide to F-Scale Damage Assessment", Maryland, 2003.
14 J.A. Sauter, R.S. Matthres, A. Yinger, J. Robinson, J. Moody, S. Riddle, "Distributed Pheromone-Based Swarming Control of Unmanned Air and Ground Vehicles for RSTA", Proceedings of SPIE Defense and Security Conference, Orlando, 2008.
15 P. Miller, M. Goodrich, "Mini, Micro and Swarming Unmanned Aerial Vehicles: A Baseline Study", The Library of Congress, 2006.
16 L. Lidowski, Captain, USAF, "A Novel Communications Protocol Using Geographic Routing for Swarming UAVs Performing a Search Mission", Air Force Institute of Technology, Wright-Patterson AFB, Ohio, 2008.
17 D. Deptula, Lt. Gen., USAF, "Air Force ISR in a Changing World", 2010.
18 V. Paelke, K. Nebe, F. Klompmaker, H. Jung, "Multi-touch Interaction for Disaster Management", GeoViz, 2011.
19 Federal Aviation Administration (FAA), AC 150/5220-24, "Airport Foreign Object Debris (FOD) Detection Equipment", 2009.
20 J. Byrnes (ed.), "Unexploded Ordnance Detection and Mitigation: Volatile Compounds Detection by IR Acousto-Optic Detectors", Springer Science+Business Media B.V., 2009.
21 R.H. Moose, "Covering the Homeland: National Guard Unmanned Aircraft Systems Support for Wildland Firefighting and Natural Disaster Events", Naval Postgraduate School, California, 2008.
22 J. Theodore, "GIS Analysis and Remote Sensing Support to Southern Wildfire Response", Photogrammetric Engineering and Remote Sensing Magazine, 997-999, 2004.
23 L. Merino, F. Caballero, J.R. Martínez-de-Dios, I. Maza, A. Ollero, "An Unmanned Aircraft System for Automatic Forest Fire Monitoring and Measurement", Journal of Intelligent Robot Systems, Vol. 65: 533-548, 2012.
24 L. Spampinato, S. Calvari, C. Oppenheimer, E. Boschi, "Volcano Surveillance Using Infrared Cameras", Earth-Science Reviews, Issues 1-2, pp. 63-91, 2011.
25 C. Van Westen, "Remote Sensing for Natural Disaster Management", ISPRS, Vol. XXXIII, Part B7, Amsterdam, 2000.
26 A. Popescu, C. Patrascu, I. Garat, "Damage Assessment on SAR Analysis Images: Flood Scenario from Romanian Eastern Carpathian Region", IWSSIP 16th International Conference on Systems, Signals and Image Processing, 2009.
27 Y. Dong, Q. Li, A. Dou, X. Wang, "Extracting Damages Caused by the 2008 Ms 8.0 Wenchuan Earthquake from SAR Remote Sensing Data", Journal of Asian Earth Sciences, 40: 907-914, 2010.
28 R.W. McMillan, "Terahertz Imaging, Millimeter-Wave Radar", U.S. Army Space and Missile Defense Command, Huntsville, USA, 2005.
29 R. Luna, D. Wronkiewicz, P. Rydlund, G. Krizanich, D. Shaver, "Damage Evaluation of the Taum Sauk Reservoir Failure Using LIDAR", 2005.
30 E. Coppock, D. Nicks, R. Nelson, S.L. Schultz, "Real-Time Creation and Dissemination of DEM Products Using Total Sight", ASPRS Annual Conference, USA, 2011.
31 F. Wang, "LIDAR Data Acquisition Methods in Emergency Management Applications", 19th International Conference on Geoinformatics, 2011.
32 Y. Lin, J. Hyyppä, A. Jaakkola, "Mini-UAV-Borne LIDAR for Fine-Scale Mapping", IEEE Geoscience and Remote Sensing Letters, 8(3), 2011.
33 J.C. Trinder, M. Salah, "Disaster Change Detection Using Airborne LIDAR", Surveying and Spatial Sciences Biennial Conference 2011, New Zealand, 2011.
34 P. Molina, I. Colomina, T. Vitoria, P.F. Silva, Y. Stebler, J. Skaloud, W. Kornus, R. Prades, "EGNOS-Based Multi-Sensor Accurate and Reliable Navigation in SAR Missions with UAVs", UAV-g 2011, Switzerland.
35 Q. Abdullah, "Lidar Fundamentals and Applications", Fugro EarthData, Inc., ASPRS 2012.