This document summarizes an existing review of ontologies for crisis management. The review identified 11 key subject areas related to crisis management concepts (e.g. disasters, infrastructure, organizations). It found 27 existing ontologies that cover these subject areas to varying degrees. Most ontologies were designed for general purposes, disaster archives, or response and relief. The review concluded that while existing ontologies cover important subject areas, some areas are not fully addressed and links between areas need improvement. It provided a basis for developing a new framework and ontology (Cihai) to improve semantic interoperability in crisis management.
Ontologies for Emergency & Disaster Management (Stephane Fellah)
OGC meeting, March 2014
OGC OWS-10 Cross-Community Interoperability
Ontologies for Emergency & Disaster Management
(The application of geospatial linked data)
Knowledge Graphs (Ilaria Maresi, The Hyve, 23 April 2020, Pistoia Alliance)
Data for drug discovery and healthcare is often trapped in silos which hampers effective interpretation and reuse. To remedy this, such data needs to be linked both internally and to external sources to make a FAIR data landscape which can power semantic models and knowledge graphs.
FAIRification Experience: Clarifying the Semantics of Data Matrices (Pistoia Alliance)
This webinar presents the Statistics Ontology (STATO), a semantic framework to support the creation of standardized analysis reports that help with the review of results in the form of data matrices. STATO includes a hierarchy of classes and a vocabulary for annotating statistical methods used in life, natural and biomedical sciences investigations, text mining, and statistical analyses.
Lessons from Data Science Program at Indiana University: Curriculum, Students... (Geoffrey Fox)
Invited talk at the NSF/TCPP Workshop on Parallel and Distributed Computing Education (EduPar) at IPDPS 2015, May 25, 2015, Hyderabad.
Discusses the Indiana University Data Science Program and its experience with online education; the program is available in both online and residential modes. We end by discussing two classes taught both online and residentially by Geoffrey Fox. One is BDAA: Big Data Applications & Analytics (https://bigdatacourse.appspot.com/course). The other is BDOSSP: Big Data Open Source Software and Projects (http://bigdataopensourceprojects.soic.indiana.edu/).
Efficient O&G does not suffice in an industry downturn – effective investment in time and effort is required to rise above the pack
Production analysis need not be mystical; it should not be rote
Nuance and subtle variations provide leading indicators into impending production issues
Decline curves, certainly crucial, must be analyzed in context
Case-based analysis, topological analysis, rule inference, and curve-plotting solutions are common, but fall short
Application of nuance analysis within an environment of Data-Intensive Scientific Discovery
Machine Learning encompasses data acquisition, transmission, retention, analysis, and reduction. The expected outgrowth of 24x7 data systems and operations centers is Knowledge Engineering and Data-Intensive Analytics, a.k.a. Machine Learning. This presentation will develop and apply Machine Learning concepts to the upstream O&G industry, with specific focus on the fundamental concepts and definitions of Machine Learning and on its application.
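The decline curves mentioned in the bullets above can be made concrete with a small sketch. This is illustrative only, with synthetic data and an assumed exponential decline model q(t) = qi * exp(-D * t), not the presenter's actual method: taking logs turns the decline into a straight line, so the decline rate D falls out of a least-squares fit.

```python
import math

# Illustrative decline-curve sketch, assuming exponential decline:
# q(t) = qi * exp(-D * t). Synthetic monthly rates with an assumed
# 5%/month decline; real analysis would use measured production data.
qi, D = 1000.0, 0.05
t = list(range(12))
q = [qi * math.exp(-D * ti) for ti in t]

# Least-squares slope of log(q) versus t recovers -D.
n = len(t)
mean_t = sum(t) / n
mean_lq = sum(math.log(x) for x in q) / n
slope = sum((ti - mean_t) * (math.log(x) - mean_lq) for ti, x in zip(t, q)) \
        / sum((ti - mean_t) ** 2 for ti in t)
print(round(-slope, 4))  # 0.05
```

Analyzing the curve "in context", as the bullet urges, then means checking whether the fitted D drifts over time windows rather than trusting a single fit.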
This slide was presented at the 2015 International Conference on Education Research.
I aggregated several of my other partial slides and reports to describe an adaptive learning model pertaining to the concept of learning analytics, as well as LOD for curriculum standards and digital resources. There is a short introduction to the ISO/IEC 20748 project, Learning Analytics Interoperability - Part 1: Reference Model.
The webinar explores some of the current opportunities for AI within Life Science and looks ahead to what we can expect to see over the coming years. These are the accompanying slides.
Paper 192 in CISTI 2021: OntoDRE: An Ontology For The Requirements... (James Miranda)
TITLE: "OntoDRE: An Ontology For The Requirements Engineering Decision Process"
TO CITE:
J. W. Pontes Miranda and R. Cristiane Gratão de Souza, "OntoDRE: An ontology for the requirements engineering decision process," 2021 16th Iberian Conference on Information Systems and Technologies (CISTI), 2021, pp. 1-6, DOI: 10.23919/CISTI52073.2021.9476446.
BibTeX:
@INPROCEEDINGS{9476446, author={Pontes Miranda, James William and Cristiane Gratão de Souza, Rogéria}, booktitle={2021 16th Iberian Conference on Information Systems and Technologies (CISTI)}, title={OntoDRE: An ontology for the requirements engineering decision process}, year={2021}, volume={}, number={}, pages={1-6}, doi={10.23919/CISTI52073.2021.9476446}}
The official presentation took place online on 24 June 2021 during the "Software Systems, Architectures, Applications and Tools" session. For more information, visit http://www.cisti.eu/
Moving beyond sameAs with PLATO: Partonomy detection for Linked Data (Prateek Jain)
The Linked Open Data (LOD) Cloud has gained significant traction over the past few years. With over 275 interlinked datasets across diverse domains such as life science, geography, politics, and more, the LOD Cloud has the potential to support a variety of applications ranging from open domain question answering to drug discovery.
Despite its significant size (approx. 30 billion triples), the data is relatively sparsely interlinked (approx. 400 million links). A semantically richer LOD Cloud is needed to fully realize its potential. Data in the LOD Cloud are currently interlinked mainly via the owl:sameAs property, which is inadequate for many applications. Additional properties capturing relations based on causality or partonomy are needed to enable the answering of complex questions and to support applications.
In this work, we present a solution to enrich the LOD Cloud by automatically detecting partonomic relationships, which are well-established, fundamental properties grounded in linguistics and philosophy. We empirically evaluate our solution across several domains, and show that our approach performs well on detecting partonomic properties between LOD Cloud data.
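The distinction the abstract draws can be illustrated with a minimal sketch, using hypothetical entities and plain Python rather than an RDF store: an owl:sameAs-style link only merges two identifiers for the same thing, while a partonomic link such as partOf is transitive and supports structural queries that identity links cannot answer.

```python
# Minimal illustration of why partonomic links add query power that
# owl:sameAs-style identity links cannot. Hypothetical example data.

# Identity links (sameAs): two identifiers for the same entity.
same_as = [("dbpedia:Paris", "geonames:2988507")]

# Partonomic links (partOf): a transitive containment structure.
part_of = {
    "dbpedia:Paris": "dbpedia:Ile-de-France",
    "dbpedia:Ile-de-France": "dbpedia:France",
    "dbpedia:France": "dbpedia:Europe",
}

def ancestors(entity, part_of):
    """Follow partOf links transitively: every whole containing entity."""
    result = []
    while entity in part_of:
        entity = part_of[entity]
        result.append(entity)
    return result

# A complex question like "is Paris part of Europe?" needs the
# transitive closure of partOf; sameAs alone cannot answer it.
print(ancestors("dbpedia:Paris", part_of))
```

This is the kind of relationship PLATO aims to detect automatically across LOD datasets, not its actual algorithm.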
Advancing Foundation and Practice of Software Analytics (Tao Xie)
Vision Statement Presentation on "Advancing Foundation & Practice of Software Analytics" at the 2nd International NSF sponsored Workshop on Realizing Artificial Intelligence Synergies in Software Engineering (RAISE 2013) http://promisedata.org/raise/2013/
While Graph Databases have come of age, Data Warehousing seems to be broken in an increasingly dynamic world. Are Graph Databases a smarter version of Data Lakes?
In this webinar, Andreas Blumauer - CEO of Semantic Web Company discusses various approaches of data and information integration and the role knowledge graphs and taxonomies play in this game.
Numerous organizations have already discovered Enterprise Linked Data as a powerful solution for a 360-degree view on various business objects. But how do they solve the big challenge of connecting their data pools in heterogeneous and highly dynamic information landscapes?
Learn more about the manifold application scenarios of linked data and semantic technologies. Dive into your data pools to gain new insights and knowledge!
The Download: Tech Talks by the HPCC Systems Community, Episode 12 (HPCC Systems)
Join us as we continue this series of webinars specifically designed for the community by the community with the goal to share knowledge, spark innovation and further build and link the relationships within our HPCC Systems community.
Episode 12 includes Tech Talks featuring speakers from our community on topics covering exploratory data analysis, geospatial solutions and ECL Tips leveraging the HPCC Systems platform.
1) Itauma Itauma, PhD Candidate, Keiser University - Conducting exploratory data analysis in educational research using HPCC Systems®
2) Ignacio Calvo, LexisNexis Risk Solutions - Big Data and Geospatial with HPCC Systems®
3) Bob Foreman, Senior Software Engineer, HPCC Systems, LexisNexis Risk Solutions - ECL Tip of the Month
On April 11th 2016, Prof. Henning Müller (HES-SO Valais-Wallis and Martinos Center) presented "Challenges in medical imaging and the VISCERAL model" at the National Cancer Institute in Washington.
Reproducibility in human cognitive neuroimaging: a community-driven data sha... (Nolan Nichols)
Access to primary data and the provenance of derived data are increasingly recognized as an essential aspect of reproducibility in biomedical research. While productive data sharing has become the norm in some biomedical communities, human brain imaging has lagged in open data and descriptions of provenance. The overarching goal of my dissertation was to identify barriers to neuroimaging data sharing and to develop a fundamentally new, granular data exchange standard that incorporates provenance as a primitive to document cognitive neuroimaging workflow.
For my dissertation research, I led the development of the Neuroimaging Data Model (NIDM), an extension to the W3C PROV standard for the domain of human brain imaging. NIDM provides a language to communicate provenance by representing primary data, computational workflow, and derived data as bundles of linked Agents, Activities, and Entities. Similar to the way a sentence conveys a standalone thought, a bundle contains provenance statements that parsimoniously express the way a given piece of data was produced. To demonstrate a system that implements NIDM, I developed a modern, semantic Web application platform that provides neuroimaging workflow as a service and captures provenance statements as NIDM bundles. The course of this work necessitated interaction with an international community, which adopted and extended central elements of this work into prevailing brain imaging software. My dissertation contributes neuroinformatics standards to advance the current state of computational infrastructure available to the cognitive neuroimaging community.
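The Agent/Activity/Entity bundling described above follows the W3C PROV core model that NIDM extends. A minimal sketch, with illustrative names rather than actual NIDM terms:

```python
from dataclasses import dataclass

# Minimal sketch of the W3C PROV core model that NIDM builds on:
# an Entity (data) is generated by an Activity (workflow step),
# which is associated with an Agent (person or software).
@dataclass
class Agent:
    name: str

@dataclass
class Activity:
    label: str
    associated_with: Agent

@dataclass
class Entity:
    label: str
    generated_by: Activity

# A bundle groups such statements the way a sentence conveys a
# standalone thought about how one piece of data was produced.
# The names below are hypothetical, not real NIDM identifiers.
tool = Agent("fMRI-preprocessing-tool")
smoothing = Activity("spatial-smoothing", tool)
smoothed_image = Entity("smoothed-bold-image", smoothing)

# Walking the chain answers "how was this data produced?"
print(smoothed_image.generated_by.associated_with.name)
```

The real NIDM serializes such statements as RDF against the PROV ontology; this sketch only shows the shape of the linked Agent-Activity-Entity chain.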
Presentation by Prof. Dr. Henning Müller.
Overview:
- Medical image retrieval projects
- Image analysis and 3D texture modeling
- Data science evaluation infrastructures (ImageCLEF, VISCERAL, EaaS – Evaluation as a Service)
- What comes next?
Presentation of the article at the Workshop of Learning Analytics & Knowledge 2016 on April 25, 2016.
Note: the full paper is available at http://www.laceproject.eu/wp-content/uploads/2015/12/ep4la2016_paper_4.pdf
Towards a Project Centric Metadata Model and Lifecycle for Ontology Mapping G... (Christophe Debruyne)
Christophe Debruyne, Brian Walshe, Declan O'Sullivan: Towards a Project Centric Metadata Model and Lifecycle for Ontology Mapping Governance. Paper presented at iiWAS 2015 on the 13th of December 2015, Brussels, Belgium.
Ontology: the study of, or concern about, what kinds of things exist, that is, what entities there are in the universe. The term derives from the Greek onto (being) and logia (written or spoken discourse). Ontology is a branch of metaphysics, the study of first principles or the root of things.
Recruitment Based On Ontology with Enhanced Security Features (theijes)
The International Journal of Engineering & Science is aimed at providing a platform for researchers, engineers, scientists, or educators to publish their original research results, to exchange new ideas, to disseminate information in innovative designs, engineering experiences and technological skills. It is also the Journal's objective to promote engineering and technology education. All papers submitted to the Journal will be blind peer-reviewed. Only original articles will be published.
Reference Domain Ontologies and Large Medical Language Models.pptx (Chimezie Ogbuji)
Large Language Models (LLMs) have exploded into the modern research and development consciousness and triggered an artificial intelligence revolution. They are well-positioned to have a major impact on Medical Informatics. However, much of the data used to train these revolutionary models are general-purpose and, in some cases, synthetically generated from LLMs. Ontologies are a shared and agreed-upon conceptualization of a domain and facilitate computational reasoning. They have become important tools in biomedicine, supporting critical aspects of healthcare and biomedical research, and are integral to science. In this talk, we will delve into ontologies, their representational and reasoning power, and how terminology systems such as SNOMED-CT, an international master terminology providing comprehensive coverage of the entire domain of medicine, can be used with Controlled Natural Languages (CNL) to advance how LLMs are used and trained.
Implementation of a Knowledge Management Methodology based on Ontologies: Cas... (rahulmonikasharma)
In this paper, we suggest a knowledge management methodology that makes use of the new possibilities offered by semantic web technologies and covers the various stages of the project life cycle. Indeed, with this new vision of ontologies and the semantic web, it is important to provide strong methodological support in order to develop complex ontology-based systems.
Information overload for communities of practice (Murray Turoff)
A study of emergency management professionals, with emphasis on medical and public health, done for the NLM. These are slides of a paper presented at Web2008 during ICIS 2008; you can request a copy of the paper from me directly, as well as other work in this area. Check my website for the full NLM report.
Social Tags and Linked Data for Ontology Development: A Case Study in the Fin... (Andres Garcia-Silva)
We describe a domain ontology development approach that extracts domain terms from folksonomies and drives the search for classes and relationships in the Linked Open Data cloud. As a result, we obtain lightweight domain ontologies that combine the emergent knowledge of social tagging systems with formal knowledge from ontologies. To illustrate the feasibility of our approach, we have produced an ontology in the financial domain from tags available in Delicious, using DBpedia, OpenCyc and UMBEL as additional knowledge sources.
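The pipeline the abstract describes (frequent social tags seeding a search over Linked Open Data sources) can be sketched generically. The tag data and the LOD lookup table below are hypothetical stand-ins for Delicious and DBpedia, not the paper's actual datasets or algorithm:

```python
from collections import Counter

# Sketch of the described pipeline: frequent social tags become domain
# terms, which are then looked up in a Linked Open Data source to find
# candidate ontology classes. All data below is hypothetical.
tagged_bookmarks = [
    ["bank", "finance", "loan"],
    ["bank", "interest", "loan"],
    ["bank", "finance", "stock"],
]

# Hypothetical stand-in for a DBpedia/OpenCyc class lookup.
lod_classes = {"bank": "dbo:Bank", "loan": "dbo:Loan", "stock": "dbo:Company"}

def candidate_classes(bookmarks, lod, min_count=2):
    """Keep tags used at least min_count times as domain terms,
    then map each surviving term to a LOD class if one exists."""
    counts = Counter(tag for tags in bookmarks for tag in tags)
    terms = [t for t, c in counts.items() if c >= min_count]
    return {t: lod[t] for t in terms if t in lod}

print(candidate_classes(tagged_bookmarks, lod_classes))
```

The frequency threshold stands in for the "emergent knowledge" filter: tags the community uses often are more likely to be genuine domain terms.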
Ontology Evaluation Methods and Metrics. This is work I did while I was at The MITRE Corporation. I came up with a framework to support ontology evaluation for reuse that could also be used for ontology construction. I was the sole author of the approach, which was intended to begin a research program and a community of practice around it. It has been on hold, and I would like that to change. I am now at the Tetherless World Constellation at Rensselaer Polytechnic Institute; if interested, contact me there.
Understanding Crises: Investigating Organizational Safety Culture by Combinin... (streamspotter)
David Passenier, Colin Mols, Jan Bím, and Alexei Sharpanskykh on "Understanding Crises: Investigating Organizational Safety Culture by Combining Organizational Ethnography and Agent Modeling" at ISCRAM 2013 in Baden-Baden.
10th International Conference on Information Systems for Crisis Response and Management
12-15 May 2013, Baden-Baden, Germany
ASC Model: A Process Model for the Evaluation of Simulated Field Exercises in... (streamspotter)
Alayne da Costa Duarte, Marcos Roberto da Silva Borges, José Orlando Gomes, and Paulo Victor R. de Carvalho on "ASC Model: A Process Model for the Evaluation of Simulated Field Exercises in the Emergency Domain" at ISCRAM 2013 in Baden-Baden.
Comparing Performance and Situation Awareness in USAR Unit Tasks in a virtual... (streamspotter)
Nanja Smets, Corine Horsch, Mark Neerincx and Raymond Cuijpers on "Comparing Performance and Situation Awareness in USAR Unit Tasks in a virtual and real Environment" at ISCRAM 2013 in Baden-Baden.
Towards a Model-Based Analysis of Place-Related Information in Disaster Respo... (streamspotter)
Stefan Sackmann, Marlen Hofmann, and Hans Betke on "Towards a Model-Based Analysis of Place-Related Information in Disaster Response Workflows" at ISCRAM 2013 in Baden-Baden.
Information Infrastructure for Crisis Response Coordination: A Study of local... (streamspotter)
Torbjørg Meum and Bjørn Erik Munkvold on "Information Infrastructure for Crisis Response Coordination: A Study of local Emergency Management in Norwegian Municipalities" at ISCRAM 2013 in Baden-Baden.
Validating Procedural Knowledge in the Open Virtual Collaboration Environment (streamspotter)
Gerhard Wickler on "Validating Procedural Knowledge in the Open Virtual Collaboration Environment" at ISCRAM 2013 in Baden-Baden.
Context-Based Knowledge Fusion Patterns in Decision Support System for Emerge... (streamspotter)
Alexander Smirnov, Tatiana Levashova, and Nikolay Shilov on "Context-Based Knowledge Fusion Patterns in Decision Support System for Emergency Response" at ISCRAM 2013 in Baden-Baden.
Exploring Shared Situational Awareness using Serious Gaming in Supply Chain D... (streamspotter)
Shalini Kurapati, Gwendolyn Kolfschoten, Thomas M. Corsi, and Frances Brazier on "Exploring Shared Situational Awareness using Serious Gaming in Supply Chain Disruptions" at ISCRAM 2013 in Baden-Baden.
LVC Training Environment for Strategic and Tactical Emergency Operations (streamspotter)
Laura Ardila, Israel Perez-Llopis, Carlos Palau, and Manuel Esteve on "LVC Training Environment for Strategic and Tactical Emergency Operations" at ISCRAM 2013 in Baden-Baden.
Ethical Challenges of Participatory Sensing for Crisis Information Management (streamspotter)
Massimiliano Tarquini and Maurizio Morgano on "Ethical Challenges of Participatory Sensing for Crisis Information Management" at ISCRAM 2013 in Baden-Baden.
The Impact of IT on the Management of Mass Casualty Incidents in Germany (streamspotter)
Nils Ellebrecht, Konrad Feldmeier, and Stefan Kaufmann on "The Impact of IT on the Management of Mass Casualty Incidents in Germany" at ISCRAM 2013 in Baden-Baden.
Towards a Knowledge-Intensive Serious Game for Training Emergency Medical Ser... (streamspotter)
Nour El Mawas and Jean-Pierre Cahier on "Towards a Knowledge-Intensive Serious Game for Training Emergency Medical Services" at ISCRAM 2013 in Baden-Baden.
Communication Interface for Virtual Training of Crisis Management (streamspotter)
Jan Rudinsky and Ebba Thora Hvannberg on "Communication Interface for Virtual Training of Crisis Management" at ISCRAM 2013 in Baden-Baden.
Optimization Modeling and Decision Support for Wireless Infrastructure Deploy... (streamspotter)
Michael R. Bartolacci, Albena Mihovska, and Dilek Ozceylan on "Optimization Modeling and Decision Support for Wireless Infrastructure Deployment in Disaster Planning and Management" at ISCRAM 2013 in Baden-Baden.
A System Dynamics Model of the 2005 Hatlestad Slide Emergency Management (streamspotter)
Jose J Gonzalez, Geir Bøe, and John Einar Johansen on "A System Dynamics Model of the 2005 Hatlestad Slide Emergency Management" at ISCRAM 2013 in Baden-Baden.
Inter-organizational Collaboration Structures during Emergency Response: A Ca... (streamspotter)
Aslak Wegner Eide, Ida Maria Haugstveit, Ragnhild Halvorsrud, and Maria Borén on "Inter-organizational Collaboration Structures during Emergency Response: A Case Study " at ISCRAM 2013 in Baden-Baden.
Unexpected Effects of Rescue Robots’ Team-Membership in a virtual Environment (streamspotter)
Corine Horsch, Nanja Smets, Mark Neerincx, and Raymond Cuijpers on the "Unexpected Effects of Rescue Robots’ Team-Membership in a virtual Environment" at ISCRAM 2013 in Baden-Baden.
A Typology to facilitate Multi-Agency Coordinationstreamspotter
Steven Curnin and Christine Owen on "A Typology to facilitate Multi-Agency Coordination" at ISCRAM 2013 in Baden-Baden.
10th International Conference on Information Systems for Crisis Response and Management
12-15 May 2013, Baden-Baden, Germany
Exercises for Crisis Management Training in intra-organizational Settingsstreamspotter
Lena-Maria Öberg and Erik A.M Borglund on "Exercises for Crisis Management Training in intra-organizational Settings" at ISCRAM 2013 in Baden-Baden.
10th International Conference on Information Systems for Crisis Response and Management
12-15 May 2013, Baden-Baden, Germany
A Novel Architecture for Disaster Response Workflow Management Systemsstreamspotter
Marlen Hofmann, Hans Betke, and Stefan Sackmann on "A Novel Architecture for Disaster Response Workflow Management Systems" at ISCRAM 2013 in Baden-Baden.
10th International Conference on Information Systems for Crisis Response and Management
12-15 May 2013, Baden-Baden, Germany
Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do...UiPathCommunity
💥 Speed, accuracy, and scaling – discover the superpowers of GenAI in action with UiPath Document Understanding and Communications Mining™:
See how to accelerate model training and optimize model performance with active learning
Learn about the latest enhancements to out-of-the-box document processing – with little to no training required
Get an exclusive demo of the new family of UiPath LLMs – GenAI models specialized for processing different types of documents and messages
This is a hands-on session specifically designed for automation developers and AI enthusiasts seeking to enhance their knowledge in leveraging the latest intelligent document processing capabilities offered by UiPath.
Speakers:
👨🏫 Andras Palfi, Senior Product Manager, UiPath
👩🏫 Lenka Dulovicova, Product Program Manager, UiPath
Kubernetes & AI - Beauty and the Beast !?! @KCD Istanbul 2024Tobias Schneck
As AI technology is pushing into IT I was wondering myself, as an “infrastructure container kubernetes guy”, how get this fancy AI technology get managed from an infrastructure operational view? Is it possible to apply our lovely cloud native principals as well? What benefit’s both technologies could bring to each other?
Let me take this questions and provide you a short journey through existing deployment models and use cases for AI software. On practical examples, we discuss what cloud/on-premise strategy we may need for applying it to our own infrastructure to get it to work from an enterprise perspective. I want to give an overview about infrastructure requirements and technologies, what could be beneficial or limiting your AI use cases in an enterprise environment. An interactive Demo will give you some insides, what approaches I got already working for real.
Essentials of Automations: Optimizing FME Workflows with ParametersSafe Software
Are you looking to streamline your workflows and boost your projects’ efficiency? Do you find yourself searching for ways to add flexibility and control over your FME workflows? If so, you’re in the right place.
Join us for an insightful dive into the world of FME parameters, a critical element in optimizing workflow efficiency. This webinar marks the beginning of our three-part “Essentials of Automation” series. This first webinar is designed to equip you with the knowledge and skills to utilize parameters effectively: enhancing the flexibility, maintainability, and user control of your FME projects.
Here’s what you’ll gain:
- Essentials of FME Parameters: Understand the pivotal role of parameters, including Reader/Writer, Transformer, User, and FME Flow categories. Discover how they are the key to unlocking automation and optimization within your workflows.
- Practical Applications in FME Form: Delve into key user parameter types including choice, connections, and file URLs. Allow users to control how a workflow runs, making your workflows more reusable. Learn to import values and deliver the best user experience for your workflows while enhancing accuracy.
- Optimization Strategies in FME Flow: Explore the creation and strategic deployment of parameters in FME Flow, including the use of deployment and geometry parameters, to maximize workflow efficiency.
- Pro Tips for Success: Gain insights on parameterizing connections and leveraging new features like Conditional Visibility for clarity and simplicity.
We’ll wrap up with a glimpse into future webinars, followed by a Q&A session to address your specific questions surrounding this topic.
Don’t miss this opportunity to elevate your FME expertise and drive your projects to new heights of efficiency.
GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using Deplo...James Anderson
Effective Application Security in Software Delivery lifecycle using Deployment Firewall and DBOM
The modern software delivery process (or the CI/CD process) includes many tools, distributed teams, open-source code, and cloud platforms. Constant focus on speed to release software to market, along with the traditional slow and manual security checks has caused gaps in continuous security as an important piece in the software supply chain. Today organizations feel more susceptible to external and internal cyber threats due to the vast attack surface in their applications supply chain and the lack of end-to-end governance and risk management.
The software team must secure its software delivery process to avoid vulnerability and security breaches. This needs to be achieved with existing tool chains and without extensive rework of the delivery processes. This talk will present strategies and techniques for providing visibility into the true risk of the existing vulnerabilities, preventing the introduction of security issues in the software, resolving vulnerabilities in production environments quickly, and capturing the deployment bill of materials (DBOM).
Speakers:
Bob Boule
Robert Boule is a technology enthusiast with PASSION for technology and making things work along with a knack for helping others understand how things work. He comes with around 20 years of solution engineering experience in application security, software continuous delivery, and SaaS platforms. He is known for his dynamic presentations in CI/CD and application security integrated in software delivery lifecycle.
Gopinath Rebala
Gopinath Rebala is the CTO of OpsMx, where he has overall responsibility for the machine learning and data processing architectures for Secure Software Delivery. Gopi also has a strong connection with our customers, leading design and architecture for strategic implementations. Gopi is a frequent speaker and well-known leader in continuous delivery and integrating security into software delivery.
Smart TV Buyer Insights Survey 2024 by 91mobiles.pdf91mobiles
91mobiles recently conducted a Smart TV Buyer Insights Survey in which we asked over 3,000 respondents about the TV they own, aspects they look at on a new TV, and their TV buying preferences.
Securing your Kubernetes cluster_ a step-by-step guide to success !KatiaHIMEUR1
Today, after several years of existence, an extremely active community and an ultra-dynamic ecosystem, Kubernetes has established itself as the de facto standard in container orchestration. Thanks to a wide range of managed services, it has never been so easy to set up a ready-to-use Kubernetes cluster.
However, this ease of use means that the subject of security in Kubernetes is often left for later, or even neglected. This exposes companies to significant risks.
In this talk, I'll show you step-by-step how to secure your Kubernetes cluster for greater peace of mind and reliability.
JMeter webinar - integration with InfluxDB and GrafanaRTTS
Watch this recorded webinar about real-time monitoring of application performance. See how to integrate Apache JMeter, the open-source leader in performance testing, with InfluxDB, the open-source time-series database, and Grafana, the open-source analytics and visualization application.
In this webinar, we will review the benefits of leveraging InfluxDB and Grafana when executing load tests and demonstrate how these tools are used to visualize performance metrics.
Length: 30 minutes
Session Overview
-------------------------------------------
During this webinar, we will cover the following topics while demonstrating the integrations of JMeter, InfluxDB and Grafana:
- What out-of-the-box solutions are available for real-time monitoring JMeter tests?
- What are the benefits of integrating InfluxDB and Grafana into the load testing stack?
- Which features are provided by Grafana?
- Demonstration of InfluxDB and Grafana using a practice web application
To view the webinar recording, go to:
https://www.rttsweb.com/jmeter-integration-webinar
UiPath Test Automation using UiPath Test Suite series, part 4DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 4. In this session, we will cover Test Manager overview along with SAP heatmap.
The UiPath Test Manager overview with SAP heatmap webinar offers a concise yet comprehensive exploration of the role of a Test Manager within SAP environments, coupled with the utilization of heatmaps for effective testing strategies.
Participants will gain insights into the responsibilities, challenges, and best practices associated with test management in SAP projects. Additionally, the webinar delves into the significance of heatmaps as a visual aid for identifying testing priorities, areas of risk, and resource allocation within SAP landscapes. Through this session, attendees can expect to enhance their understanding of test management principles while learning practical approaches to optimize testing processes in SAP environments using heatmap visualization techniques
What will you get from this session?
1. Insights into SAP testing best practices
2. Heatmap utilization for testing
3. Optimization of testing processes
4. Demo
Topics covered:
Execution from the test manager
Orchestrator execution result
Defect reporting
SAP heatmap example with demo
Speaker:
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
UiPath Test Automation using UiPath Test Suite series, part 4
Ontologies for Crisis Management: A Review of State of the Art in Ontology Design and Usability
1. Ontologies for Crisis Management: A Review
of State of the Art in Ontology Design and
Usability
Shuangyan Liu
Aston University, Birmingham, UK
s.liu10@aston.ac.uk
WITH THE FINANCIAL SUPPORT OF THE PREVENTION, PREPAREDNESS, AND CONSEQUENCE MANAGEMENT OF TERRORISM
AND OTHER SECURITY-RELATED RISKS PROGRAMME. EUROPEAN COMMISSION - DIRECTORATE-GENERAL HOME AFFAIRS
2. Outline
• What are Semantic Technologies?
• What is an Ontology?
• Why are Semantic Technologies relevant?
• Existing Ontologies for Disaster Management
• Conclusion and Future Work
3. What are Semantic Technologies?
Semantic Web Technology Stack (Steve Bratt, 2006)
Diagram annotations: URIs assign an unambiguous name to something; RDF provides the basic data model; OWL is an ontology language with rich expressiveness; the upper layers are still experimental.
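As a minimal illustration of the RDF data-model layer, a single statement (a subject–predicate–object triple) can be written in the Turtle syntax. The identifiers below are invented for illustration; only the triple pattern itself is the point:

```turtle
@prefix ex: <http://example.org/> .

# One RDF statement: subject, predicate, object.
ex:Kempsey ex:locatedIn ex:Worcestershire .
```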
5. What is an Ontology?
Tim Berners-Lee:
“An ontology is a document or file that formally defines the relations among terms.”
OWL is a formal ontology language that provides standard labels for describing terms:
o Classes (owl:Class, owl:unionOf etc.)
o Properties (owl:ObjectProperty, owl:DatatypeProperty, rdfs:domain, rdfs:range etc.)
o Property restriction (owl:allValuesFrom, owl:cardinality etc.)
o Relations (owl:equivalentClass, rdfs:subClassOf, owl:equivalentProperty etc.)
o Characteristics of properties (e.g. owl:SymmetricProperty)
o Datatypes (e.g. rdfs:Literal)
o ... and more
A domain ontology provides a shared understanding of the domain.
Querying and reasoning using an ontology can help reveal implicit concepts and relationships that may not be readily apparent.
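The OWL labels listed above can be combined into a small ontology. The following Turtle sketch is illustrative only: the class and property names (ex:Person, ex:livesIn, etc.) are invented for this example and do not come from any of the surveyed ontologies. The equivalence between VulnerableResident and DisadvantagedGroup mirrors the flooding scenario discussed in the speaker notes:

```turtle
@prefix ex:   <http://example.org/crisis#> .
@prefix owl:  <http://www.w3.org/2002/07/owl#> .
@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
@prefix xsd:  <http://www.w3.org/2001/XMLSchema#> .

# Classes (owl:Class) arranged in a hierarchy (rdfs:subClassOf)
ex:Person a owl:Class .
ex:Place  a owl:Class .

# A relation between terms (owl:equivalentClass), as in the flooding
# scenario: two databases' vocabularies declared to mean the same thing
ex:VulnerableResident a owl:Class ;
    rdfs:subClassOf ex:Person ;
    owl:equivalentClass ex:DisadvantagedGroup .

# An object property with a domain and a range
ex:livesIn a owl:ObjectProperty ;
    rdfs:domain ex:Person ;
    rdfs:range  ex:Place .

# A datatype property
ex:hasName a owl:DatatypeProperty ;
    rdfs:domain ex:Person ;
    rdfs:range  xsd:string .

# A property restriction (owl:cardinality): each person lives in exactly one place
ex:Person rdfs:subClassOf [
    a owl:Restriction ;
    owl:onProperty  ex:livesIn ;
    owl:cardinality "1"^^xsd:nonNegativeInteger
] .
```

A reasoner loading this ontology can treat instances of ex:DisadvantagedGroup and ex:VulnerableResident interchangeably, which is what enables the cross-database integration described in the flooding scenario.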
6. Why are Semantic Technologies/Ontologies relevant?
• To promote semantic interoperability
• To provide a specified reference or a common language that can be used to describe disaster-related things
• To enable integration of crisis information systems and data
• To add semantics to web service descriptions, which enables automatic service discovery
7. A Survey of Ontologies for Disaster Management
• Research Motivation
• Research Methodology
– Research Questions
– Data Collection Method
– Data Analysis Method
• Results
– Coverage of Ontologies
– Design of Ontologies
– Use Cases of Ontologies
8. Research Motivation
• The semantic interoperability challenge (Fan & Zlatanova, 2011; W3C EIIF 2009)
• Prerequisite: to establish shared vocabularies
• Lack of a common vocabulary in disaster management
• No overview of the information
• Aim
– identify the areas of concepts represented in crisis information management systems, and
– identify the existing ontologies that cover these concepts.
9. Research Questions
• What subject areas do the concepts used in disaster management belong to?
• What are the existing ontologies that cover these subject areas?
• How are the existing ontologies for disaster management designed and used?
11. Methodology
Data Analysis Sub-questions
– QI-1: What subject areas describe the range of concepts involved in crisis and disaster management?
– QII-1: What ontologies exist that cover each subject area?
– QII-2: Does an individual ontology include concepts in one subject area or in multiple subject areas?
– QII-3: Is the ontology represented formally? If yes, what language is used to describe the ontology?
– QII-4: Is the ontology publicly accessible, e.g. downloadable from a website?
– QIII-1: What is the purpose of the ontology, e.g. the type of crisis management system it is aimed at?
– QIII-2: How many concepts or terms are defined in the ontology?
– QIII-3: What categories of concepts are defined in the ontology, e.g. classes, object properties and/or data properties?
– QIII-4: What is the approach or principle used to design the ontology?
– QIII-5: Is there a use case that demonstrates the functionalities of the ontology?
Data Analysis Method
13. Results
Existing Ontologies

Subject Area | Ontology | Language | Downloadable | Documentation
Resources (3) | SOKNOS | OWL-DL | No | Minimal (academic nature)
Resources | MOAC | RDF | Yes | Online specification
Resources | SIADEX | Not known | No | Minimal (academic nature)
Processes (2) | ISyCri | OWL-DL | No | Minimal (private wiki, in French)
Processes | WB-OS | XML | Upon request | Academic nature
People (2) | FOAF | RDF | Yes | Online specification
People | BIO | RDF | Yes | Online specification
Organisations (3) | ERO2M | N/A | No | Academic nature
Organisations | IntelLEO | RDF | Yes | Online specification
Organisations | Organisation Ontology | RDF | Yes | Online specification
14. Results
Existing Ontologies (Cont.)

Subject Area | Ontology | Language | Downloadable | Documentation
Damage (1) | HXL | RDF | Yes | Online specification
Disasters (4) | EM-DAT | N/A | Online query | Classification of disasters available
Disasters | UNEP-DTIE | N/A | Online query | Online documentation
Disasters | Canadian Disaster Database | N/A | Online query | Classification of disasters available
Disasters | Australian Government Attorney-General’s Department Disasters Database | N/A | Online query | Online documentation
Infrastructure (3) | PSCAD | N/A | No | Minimal (academic nature)
Infrastructure | EPANET | N/A | No | Minimal (academic nature)
Infrastructure | OTN | OWL | Yes | Specification available
Geography (1) | GeoNames | RDF | Yes | Online documentation
15. Results
Existing Ontologies (Cont.)

Subject Area | Ontology | Language | Downloadable | Documentation
Hydrology (1) | Ordnance Survey Hydrology Ontology | OWL | Yes | Online documentation
Meteorology (1) | NNEW weather ontology | OWL | Yes | Online documentation
Topography (4) | USGS CEGIS | OWL | Yes | Not available
Topography | Ordnance Survey Buildings and Places Ontology | OWL | Yes | Online documentation
Topography | E-response Building Pathology Ontology | OWL | Yes | Not available
Topography | E-response Building Internal Layout Ontology | OWL | Yes | Not available
Other (1) | AktiveSA (multi-domain) | OWL | Yes | Not available
18. Results
Purpose of Ontologies
• General
• Disaster Archive
• Humanitarian Response & Relief
• Infrastructure System Simulators
• Decision Support
• Response Coordination
• Resource Management
• Disaster Management Guidance Website
• Situation Awareness
19. Results
Coverage of Ontologies
• 65% are lightweight (average size: 119 concepts)
• 35% are large, some containing a large number of instances
• 65% contain four types of concepts
21. Results
Gaps
• Some important subject areas are not fully addressed, e.g. damage, people, processes
• No formally represented ontology for describing disaster events
• Lack of links between subject areas
23. Conclusion
• As a result of the review, we identified a set of critical subject areas that cover the information concepts dealt with in crisis management, and the currently existing ontologies that represent these subject areas.
• All 11 identified subject areas are covered by existing ontologies, and 65% of the existing ontologies are semantically interoperable.
• This review provides an overall picture of the subject areas and how they are represented and used in crisis management systems.
• It provides a basis for identifying the missing vocabularies and for constructing a new framework of ontologies for emergency and disaster management.
24. Where Are We Now?
• Proposed a semantic framework of emergency and disaster management
– http://www.disaster20.eu/smerst-2013/wp-content/uploads/2013/05/shuangyan_presentation_SMERST2013.pdf
• Developed an ontology (Cihai) for emergency and disaster response information interoperability
• Developed a use case of using Cihai to structure earthquake data from the GDACS website (recorded presentation)
– http://youtu.be/ZzrxYn_s2A0
25. Future Work
• Publish the Cihai ontology online (presence, feedback, improvement)
• Develop use cases (feedback, improvement, applications)
• Collaboration (W3C Emergency Information Community Group)
26. The End
Thank you!
Disaster 2.0 Project
Semantic Technologies for Disaster Management
Shuangyan Liu
s.liu10@aston.ac.uk
Editor's Notes
This is part of my research for the Disaster 2.0 project. As background: in the project, we want to explore how semantic web technologies are currently used, and can potentially be used, for disaster management.
Semantic technologies are the technologies created in the development of the Semantic Web [2][3]. Most of the Web's content today is designed for humans to read, not for computer programs to manipulate meaningfully. Computers can adeptly parse Web pages for layout and routine processing (here a header, there a link to another page), but in general computers have no reliable way to process the semantics: this is the home page of the Hartman and Strauss Physio Clinic; this link goes to Dr. Hartman's curriculum vitae. The Semantic Web will bring structure to the meaningful content of Web pages, creating an environment where software agents roaming from page to page can readily carry out sophisticated tasks for users. The essential property of the World Wide Web is its universality: the power of a hypertext link is that anything can link to anything. There is a difference between information produced primarily for human consumption and that produced mainly for machines, and the Web has developed most rapidly as a medium of documents for people rather than for data and information that can be processed automatically. The Semantic Web aims to make up for this. The diagram represents the layer cake of the Semantic Web, which describes the main layers of its design and vision. At the bottom we find XML, a language letting one write structured web documents with a user-defined vocabulary. RDF is a basic data model, like the ER model, for writing simple statements about web resources. RDF Schema provides modelling primitives for organising web objects into hierarchies; RDFS is based on RDF and can be viewed as a primitive language for writing ontologies.
OWL is a more powerful ontology language that expands RDF Schema and allows the representation of more complex relationships between objects. The logic layer is used to enhance the ontology language further and to allow the writing of application-specific declarative knowledge (in declarative sentences or indicative propositions). The proof layer involves the actual deductive process, the representation of proofs in web languages, and proof validation. The trust layer will emerge through the use of digital signatures and other kinds of knowledge (recommendations by trusted agents, or ratings and certification by agencies and consumer bodies). Other technologies such as trust management are still work in progress. Note that XML allows users to add arbitrary structure to their documents but says nothing about what the structures mean.
Let's have a look at a flooding scenario first, which shows the typical type of problem that semantic technologies are applied to. Flooding is a common disaster in the UK; the most recent flooding happened across the country last November. More than 800 homes were flooded after storms hit parts of England and Wales, and a number of homes in Kempsey, Worcestershire had to be evacuated. Some people might need extra support in a flood or emergency, and normally the local authorities maintain the lists of vulnerable people in the area. Imagine that Worcestershire county council and Kempsey police both have a database that contains information about vulnerable people. However, the two databases may use different table identifiers for what is in fact the same concept. A program that wants to combine information across the two databases has to know that these two terms are being used to mean the same thing, so the program must have a way to discover such common meanings for whatever databases it encounters. A solution to this problem is provided by a basic component of the Semantic Web called ontologies.
A brief introduction to ontology: the term originates from philosophy; what we mean here is the sense commonly used by the AI and Semantic Web communities. OWL2 is a language for describing sets of things, called 'classes'. Any statement we make about a class in OWL2 is used to differentiate that class from the set of all things. We use these labels to describe the terms for a domain: OWL has labels for defining classes and properties, labels to define property restrictions such as value constraints and cardinality constraints (e.g. a computer has only one motherboard), labels for relations between classes and properties, and many others. (RDF is a basic data model for the Semantic Web, but the expressive power of RDF and RDFS is very limited in some areas. The Web Ontology Language (OWL) provides richer expressiveness than RDF and RDF Schema: it adopts the RDFS meaning of classes and properties (rdfs:Class, rdfs:subClassOf, etc.) and adds language primitives to support the richer expressiveness required.) Going back to the flooding scenario, imagine an ontology defined for the domain in which the two classes 'VulnerableResidents' and 'DisadvantagedGroup' are defined as equivalent classes. A program that wants to retrieve the related data from two different databases can then know they have the same meaning and can combine the data.
At a fundamental level, different groups can have fundamentally different conceptualisations of disasters and disaster management and might use very different terminologies, which prevents the integration of disaster data from different sources. An ontology can serve as a specified reference to be used by personnel in different organisations, thus constituting a common language spoken by the different organisations; serve as a standard, facilitating the integration of different crisis information systems at the user interface and data levels; and match service requests and offers, to discover the most appropriate offer for a given request.
At fundamental level, different groups can have fundamentally different conceptualisations of disasters and disaster management and might use very different terminologies
Consequently, we aim to answer the following research questions: … Gaps? What standards, if any, do the existing ontologies relevant to crisis management conform to?
Data Collection Methods: we searched databases and forums relevant to disaster management, information systems and the semantic web; selected nineteen papers to include in our review (for the full list of papers see [5]); looked for the keywords highlighting the concepts presented in the papers and added them to a list of subject areas; and searched the papers and the Web to identify relevant ontologies. The number of relevant ontologies collected was 26. Driving question 1: how do you generate the list of 11 subject areas, e.g. how do you decide which area 'Shelter' belongs to? By common sense, and by referring to dictionaries if unsure where to put it. http://tagcrowd.com/
We have identified 11 subject areas to which the concepts used in disaster management belong. The subject areas identified show two groups of concepts involved in crisis information systems: common concepts (people, organisations, resources, disasters, geography, processes, infrastructure, damage) and unusual concepts (topography, hydrology and meteorology). Subclasses of each area are given as examples to illustrate the concept.
A total of 26 ontologies were identified that cover the 11 subject areas. The following tables show the number of ontologies identified for each area, their names, their representation languages, their accessibility and their documentation. (Not mentioned here: Point 1, accessibility and representation: the tables show that among the ontologies designed originally for crisis management, very few (e.g. MOAC, HXL) are formally represented and publicly accessible. Point 2, missing areas: for the disasters and infrastructure areas, no formally represented ontologies were found; for the processes area, no publicly available ontologies were found.) To mention: Point 3, coverage of concepts: 20 out of 26 ontologies describe concepts in a single subject area; a few represent multiple subject areas. In the tables, an ontology is assigned to the subject area where the main purpose of the ontology lies.
Point out the patterns. The lower numbers are in areas providing unusual concepts, e.g. hydrology and meteorology. Some important common conceptual areas are not fully addressed, e.g. damage, people and processes. At the top of the list, although the biggest number of ontologies was found for disasters, a look at their form shows they are not formally represented but exist as database schemas. Other areas near the top of the list have ontologies that are not publicly available, such as the resources ontologies.
Analysis of the 'No' cases: not formal: mainly the common areas; not public: mainly the common areas; not well documented: mainly the common areas; formal and public: half; not formal but public: mainly the common areas. Only a few are formal, public and well documented.
The 'General' purpose includes people, organisations, transport systems, geography, and so on.
Patterns: 11 out of the 17 ontologies (i.e. 65% of the countable ones) are lightweight, containing fewer than 300 concepts in total (average size: 119 concepts). Six ontologies (35% of the countable ones) are relatively large, containing over 500 concepts (average size: 1297 concepts). Some contain a large number of instances, e.g. SIADEX and GeoNames.
Analysis of ontology design was conducted for each subject area (11 subject areas in total), focusing on the principles used for structuring concepts, the concepts represented in each ontology, and the types of crisis information systems they are aimed at. Approaches observed: taxonomy (Damage, Disasters); specialisation (Damage); hierarchical structure (all, as in Resources); properties (People, Infrastructure, Geography, Hydrology, Weather); relationships (Processes, Geography); upper-level ontology (Organisation; SUMO, domain independent); reverse-engineering approach (Topography, designed from already existing data); scenario-based approach (E-response Building Pathology and Internal Layout ontologies). The advantage of conforming to an upper-level ontology lies in the ability to align the model with a set of common, cross-domain notions, which can reduce the heterogeneity in domain-specific ontologies. The reverse-engineering approach refers to using the geospatial databases of a national map to construct the topography ontology. The scenario-based approach is used when the system does not yet exist and an ontology is built for it. (Feedback: show examples from the identified ontologies, mainly diagrams; missing areas.)
Damage: affected population and places; are there other types of things that are affected, such as affected infrastructure (e.g. an affected airport)? Aspects of damage, causes. People: FOAF covers general features, with no specialisation of person types; shall we care about the variety of people involved in disaster response? Processes: related to actors, their resources, the services they provide and their procedures; no publicly available ontology found. For a response planning system, we may care about who is going to do what type of response task and the procedures to complete the task (in case they do not know how, cf. the miseiphone app). However, does the information need to be recorded during or after the disaster? Disaster: classification of disasters, plus other properties such as start time and boundary. Open discussion.
Driving question 2: who are the end-users of the research? (Other work will cover this; show detailed examples, such as linked data science.) An NGO could refer to the ontology to create its databases.
Use Case – Cihai at H4D2: the Cihai Ontology Project at the hackathon H4D2. Pipeline: unstructured earthquake data from the GDACS website; structured data in RDF using the Cihai ontology; a Fuseki SPARQL data repository; a Cihai SPARQL endpoint; queries made against the SPARQL endpoint.
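The pipeline above (GDACS page, RDF via Cihai, Fuseki, SPARQL endpoint) implies data shaped roughly like the following Turtle sketch. The Cihai vocabulary itself is not shown in these slides, so every cihai: term below (the namespace, cihai:Earthquake, cihai:magnitude, etc.) is a hypothetical placeholder, not the ontology's actual vocabulary; the event values are likewise invented for illustration.

```turtle
@prefix cihai: <http://example.org/cihai#> .   # hypothetical namespace, not the real one
@prefix geo:   <http://www.w3.org/2003/01/geo/wgs84_pos#> .
@prefix xsd:   <http://www.w3.org/2001/XMLSchema#> .

# One GDACS earthquake alert, restructured as RDF.
# All cihai: terms and all values are invented for illustration.
<http://example.org/event/eq-001>
    a cihai:Earthquake ;
    cihai:magnitude  "6.1"^^xsd:decimal ;
    cihai:occurredAt "2013-04-20T00:02:47Z"^^xsd:dateTime ;
    geo:lat          "30.3"^^xsd:decimal ;
    geo:long         "103.0"^^xsd:decimal ;
    cihai:affects    <http://sws.geonames.org/1814991/> .   # a GeoNames feature URI
```

Once such triples are loaded into the Fuseki repository, they can be retrieved through the SPARQL endpoint, which is the final step of the use case.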