Tornadoes over the last 56 years; the length of each line represents the intensity on the Fujita scale.
Early warning systems for tornadoes: the GOES series of satellites (1970s) visualizing storms; the NEXRAD series of Doppler radars (1980s); and Skywarn spotters, a form of crowdsourcing, watching for actual funnels to come down out of these cloud systems in seconds.
Later developments sought to get inside storms (literally) with Doppler on Wheels, mobile mesonets, CASA radars (small building- or tower-mounted radars that see just a few kilometers away for an earlier prediction), and sensors designed to let a tornado pass directly over them.
This has sparked an obsession that's a mix of academic curiosity, humanitarianism, and adrenaline. Despite all this investigation, the current warning time is still only 13 minutes, with a 70% false-positive rate (VORTEX data).
I found that the workhorse of warning technology in all disasters is the satellite: transmitting telemetry from remote volcanic sensors, seeing fault lines under the water, or capturing multiple spectral wavelengths for analysis of moisture content or volcanic plumes.
And the new kid on the block is the unmanned aerial vehicle (UAV). They're cheaper, can get closer, and can measure things directly: NASA flying drones into volcanic plumes to verify the spectral measurements of satellites; mapping (Drone Adventures) in Haiti and the Philippines; SAR uses; and a Camp Roberts experiment for FEMA, taping a regular cell phone to a drone in flight to measure cellular signal strength. Camp Roberts is a federally funded exchange of ideas between federal agencies, academics, and industry without all the binding restrictions.
What started with whiteboards, radios, and triage tags now includes barcode readers, RFIDs (radio-frequency identifiers), and electronic data collection (EDC) tools, used to track patients and to notify delivery vehicles (ambulances) and depots (hospitals) so as to avoid gluts or scarcity. These use either cellular or WiFi connections.
One such system (the Wireless Internet Information System for Medical Response in Disasters, WIISARD) uses a scalable WLAN with mesh nodes connecting EDC triage tablets and remote visualization platforms with triage stations, ambulances, hazmat, police and fire teams, as well as hospitals.
Doing assessments progressed from manuals and paper forms, to "walking papers" (a manually updatable map), to electronic data collectors such as this one from DataDyne, which transmits over either cellular or SMS as needed. More recently, studies show that crowdsourced tweets and photos from the disaster can supplement the manual efforts.
This includes overhead imagery (satellite, manned aircraft, and UAV), hand-done surveys, and crowdsourced reports (IFRC & Twitter), as well as MapApp from Patrick Meier's group, used to ascertain the status of individuals and, if there's trouble, indicate what it is.
Communications within organizations are robust, usually provided by radios with or without repeaters, cellular and especially satellite technology, as well as WiFi networks for the response community. This is a dirigible with an antenna onboard, used out at Camp Roberts in California.
In addition, more effort is lately being placed on cellular network restoration with cells on wheels, light trucks, and base transceiver stations, as well as on providing WiFi to survivors, as here in the Philippines. Survivors predominantly use SMS where it's available.
Communications between organizations are not as robust. The cluster system in 2005 was a big leap forward, organizing the collection and sharing of information into specialty-specific groups. It was followed by the creation of common operational datasets and common symbols for mapping, which is huge: without common language and terms you get a Tower of Babel, with everyone able to understand only themselves. Then came the UN's 3W (who's doing what, where) and the OneResponse website, now the Humanitarian Response website without the inter-cluster emphasis (I understand it is to be reformatted in the next few months to be more of an inter-cluster communication tool), and EPIC, a vastly underappreciated technology, at least in its first iteration, meant to facilitate sharing of needs-assessment-level data between response groups. This is one area where my research showed that groups working for the same funding resources tend to compete rather than collaborate, and in some instances more basic human issues get in the way.
Wastes transportation assets
Besides doing all the above functions in pretty chaotic conditions -
In the field of Natural Language Processing, this is the holy grail. Google uses search terms to produce tools like Google Flu Trends, which beats the CDC by approximately a week on flu outbreaks. There are currently two methods for interpreting texts in various languages full of abbreviations and slang: probabilistic topic models and parallel bilingual corpora. The alternative, depending on native speakers to crowdsource translations (known as mechanical turking), has been found to be impractical for sustained efforts.
You train an algorithm called a binary classifier to search for all terms related to an illness; using categories of ailments, symptoms, signs, and treatments reduces the number of tweets significantly, allowing for computer examination of topics (the Ailment Topic Aspect Model, ATAM; Paul and Dredze, JHU).
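As a rough sketch of that idea (not the actual ATAM code; the tweets, labels, and wording below are invented for illustration), a minimal naive Bayes binary classifier can filter a stream down to flu-related tweets while screening out a "Bieber fever" confuser:

```python
from collections import Counter
import math

# Tiny invented training set standing in for the ~5,100 hand-labeled
# examples mentioned in the talk (1 = flu-related, 0 = not).
TRAIN = [
    ("i have a fever and a bad cough", 1),
    ("sneezing all day, think i caught the flu", 1),
    ("terrible headache and sore throat", 1),
    ("taking tylenol for my flu symptoms", 1),
    ("i got bieber fever at the concert", 0),   # a "confuser"
    ("this new song is sick", 0),
    ("traffic on the highway again", 0),
    ("great coffee this morning", 0),
]

def tokenize(text):
    return text.lower().split()

class BinaryNB:
    """Naive Bayes binary classifier: picks the more likely label."""
    def __init__(self, data):
        self.counts = {0: Counter(), 1: Counter()}
        self.labels = Counter(label for _, label in data)
        for text, label in data:
            self.counts[label].update(tokenize(text))
        self.vocab = set(self.counts[0]) | set(self.counts[1])

    def predict(self, text):
        scores = {}
        for label in (0, 1):
            total = sum(self.counts[label].values())
            score = math.log(self.labels[label] / sum(self.labels.values()))
            for w in tokenize(text):
                # Laplace smoothing so unseen words don't zero out a class.
                score += math.log((self.counts[label][w] + 1) /
                                  (total + len(self.vocab) + 1))
            scores[label] = score
        return max(scores, key=scores.get)

clf = BinaryNB(TRAIN)
stream = [
    "woke up with a fever and a cough",
    "bieber fever is real",
    "stuck in traffic downtown",
]
flu_related = [t for t in stream if clf.predict(t) == 1]
```

Filtering first in this way is what makes the numbers tractable: only the surviving tweets need further topic analysis.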
The Artificial Intelligence for Disaster Response (AIDR) platform from the Qatar Computing Research Institute uses crowdsourced individuals to create topic labels; the computer can then process tweets automatically, and the results can then be examined.
A device that could provide an accurate depiction of the disaster, with the real-time location and identity of every group in the area, would provide situational awareness to all: operations centers, first-response groups, etc.
Uses of ICT to Impact Organizational and Operational Challenges in Disasters
Current uses of ICT in disasters
• Patient tracking (akin to logistics)
• Assessments - epidemiologic / need / damage
• Geolocating areas of need
• Communications within organizations
• Capturing and sharing data
• Incident management
LA Lenert, D Kirsh, WG Griswold, C Buono, J Lyon, R Rao, TC Chan. Design and evaluation of a wireless electronic health records system for
field care in mass casualty settings. J Am Med Inform Assoc. 2011;18(6):842-52. Epub 2011 Jun 27
Challenges to Sharing Information
• Structure and motivation of the agencies
• Leadership capabilities
• Social / political structures of the affected area
• Needs drive communication – needs for fundraising
lead to competitive motivations
Signs of poor collaboration
• Process is confused with results
• Misallocation of resources
• Maldistribution of teams
• Resupplies not coordinated between groups
• Survivor problems evolve to a critical state
Leading ICT challenges in disaster response
• perceiving survivors’ needs in near real-time
• assessing damage geographically
• creating greater horizontal and vertical integration of the
Perceiving survivors’ needs in near real time
Perceiving survivors' needs – text analysis
• Natural Language Processing - the Holy Grail
• Currently uses two methods:
probabilistic topic models
parallel bilingual corpora
• Mechanical Turk method of crowdsourcing translations not practical for sustained efforts
Probabilistic topic models
Start with 2 billion tweets
Train a binary classifier to search for all terms related to
“flu” with 5,100 training examples
fever/ sneeze/ cough/ pain, etc
= 1.63 million hits
Tweets categorized as ailments, symptoms, and
treatments. Excludes “confusers” – “I got Bieber fever”
QCRI's crowd-trained algorithm classifies large
volumes of texts into various categories for
individualized attention by programs like
MicroMappers and CrisisTracker
Parallel bilingual corpora
Used to teach computers languages
Includes a known language aligned next to the unknown one
(essentially the Rosetta Stone of the cyber age)
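To make the Rosetta-stone analogy concrete, here is a toy sketch (the three sentence pairs are invented; real systems use millions of aligned sentences and proper alignment models) of guessing word translations by co-occurrence counting across aligned pairs:

```python
from collections import defaultdict

# Toy aligned English-French sentence pairs (invented).
PAIRS = [
    ("the house", "la maison"),
    ("the car", "la voiture"),
    ("blue house", "maison bleue"),
]

# Count how often each source word co-occurs with each target word.
cooc = defaultdict(lambda: defaultdict(int))
for src, tgt in PAIRS:
    for s in src.split():
        for t in tgt.split():
            cooc[s][t] += 1

def best_translation(word):
    """Pick the target word that co-occurs most with `word`."""
    if word not in cooc:
        return None
    return max(cooc[word], key=cooc[word].get)
```

Because "house" appears alongside "maison" in two different pairs but alongside "la" and "bleue" only once each, the counts alone pull out the right pairing; with enough parallel text, the same principle teaches a computer the whole vocabulary.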
Providing integration – horizontal and vertical