This document discusses standards and protocols that enable interoperability for connecting devices to the internet of things. It notes that while there are many choices for communication protocols, interoperability requires agreement on information exchange standards. The talk will cover existing information exchange standards and technologies that can be used with the standards. It emphasizes that every API or protocol standard should have an accompanying information model that defines relevant concepts to provide necessary context and metadata. Semantic web technologies like ontologies, RDF, and SPARQL can help integrate data from different sources by mapping terms and relationships between standards. This approach supports interoperability across distributed, loosely coupled smart city systems.
Digital educational media are defined as materials composed of digital, interactive media produced to facilitate learning; they take advantage of multimedia capabilities to go beyond analog formats, must be accessible on three levels, and provide an interactive basis for the development of learning.
RDF Schema provides the framework to describe application-specific classes and properties.
RDF Schema ‘semantically extends’ RDF to enable us to talk about classes of resources, and the properties that will be used with them.
Classes in RDF Schema are much like classes in object-oriented programming languages. This allows resources to be defined as instances of classes, and subclasses of classes.
RDF schemas are Web resources (and have URIs) and can be described using RDF
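The bullets above can be illustrated with a toy example. This is a minimal sketch using plain Python tuples as (subject, predicate, object) triples; the `ex:` vocabulary is hypothetical and a real system would use a library such as rdflib, but the inference rule shown is the standard RDFS one: if `?x rdf:type ?c` and `?c rdfs:subClassOf ?d`, then `?x rdf:type ?d`.

```python
triples = {
    # An application-specific class defined as a subclass of another
    ("ex:TrafficSensor", "rdfs:subClassOf", "ex:Sensor"),
    # A resource defined as an instance of a class
    ("ex:sensor42", "rdf:type", "ex:TrafficSensor"),
    ("ex:sensor42", "ex:locatedAt", "Market St & 5th"),
}

def infer_types(triples):
    """Apply the rdfs:subClassOf entailment rule until fixpoint."""
    triples = set(triples)
    changed = True
    while changed:
        changed = False
        new = {
            (s, "rdf:type", d)
            for (s, p, c) in triples if p == "rdf:type"
            for (c2, p2, d) in triples
            if p2 == "rdfs:subClassOf" and c2 == c
        }
        if not new <= triples:
            triples |= new
            changed = True
    return triples

entailed = infer_types(triples)
print(("ex:sensor42", "rdf:type", "ex:Sensor") in entailed)  # True
```

The point of the sketch: once `ex:TrafficSensor` is declared a subclass of `ex:Sensor`, any consumer of the schema can conclude that `ex:sensor42` is also a sensor, without that fact being stated explicitly.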
The document describes the different types of educational media, including visual media such as printed material, the chalkboard, and slides; auditory media such as the spoken word, radio, and recordings; audiovisual media such as video, television, and presentations; and computer and telematic media such as multimedia, interactive video, the internet, email, and chat. It defines the technical and pedagogical characteristics of each medium and offers specific examples.
The document describes the different types of inorganic compounds and their respective nomenclatures. It explains that inorganic compounds can be binary, ternary, or quaternary, depending on the number of elements they contain. It also details the nomenclature of oxides, hydrides, binary salts, peroxides, and hydracids.
Difference between a didactic resource and a didactic material, by noemir4
A didactic resource is any support material a teacher uses to facilitate classroom activities, while didactic materials are produced resources, such as videos, images, and music, used to facilitate teaching processes. An interactive whiteboard lets users manipulate computer applications by touching its surface and make digital annotations that can be saved, printed, or shared. Students are more motivated in classes that use an interactive whiteboard due to…
This document describes didactic media and their functions. It defines didactic media as instruments that help trainers teach and that make it easier for students to achieve learning objectives, such as whiteboards, projectors, and computers. It explains that the main functions of didactic media are to facilitate the teaching-learning process and to motivate students by bringing them closer to otherwise inaccessible realities. It also describes different types of traditional and audiovisual media.
This document defines educational materials and resources and discusses their importance and characteristics. Educational materials are products designed to support learning and teaching, while educational resources are any material used in an educational context. It is crucial to select materials that fit students' needs, abilities, and context to ensure their effectiveness.
This document presents the syllabus for the course "Taller de Medios y Materiales Educativos" (Workshop on Educational Media and Materials), part of the Bachelor of Education program at Universidad Nacional "Pedro Ruiz Gallo". The syllabus describes the course objectives: to analyze the theoretical foundations of educational media and materials, to demonstrate commitment to their design and use, and to produce materials from recyclable resources. The program is divided into three units covering educational media, educational materials, and the design of materials for each specialization. The methodology…
Smart Cities that don't go "bump" in the night: delivering interoperable smar…, by Rick Robinson
This document discusses concepts related to smart cities and their information modeling. It begins by defining a smart city and outlining some of its key components. It then provides examples of concepts that could be included in an information model for city systems, such as organizations, alerts, incidents, assets, and locations. It also discusses existing standards that could be leveraged for modeling these concepts and provides examples of their current use. Finally, it presents an approach for developing a semantic model called SCRIBE that is aligned with standards, customizable for different city needs, and extensible.
Semantic technologies for the Internet of Things, by Payam Barnaghi
The document discusses semantic technologies for the Internet of Things. It describes how sensor data in the IoT is time-dependent, continuous, and variable quality. Semantic annotations and machine-interpretable formats like XML and RDF are needed to make the data interoperable. Ontologies provide formal definitions of concepts and relationships in a domain that enable machines to process IoT data and enable autonomous device interactions. The document outlines approaches to semantically describe sensor observations and measurements using XML, RDF graphs, and adding domain concepts and logical rules with ontologies.
Cities are composed of complex systems with physical, cyber, and social components. Current works on extracting and understanding city events mainly rely on technology enabled infrastructure to observe and record events. In this work, we propose an approach to leverage citizen observations of various city systems and services such as traffic, public transport, water supply, weather, sewage, and public safety as a source of city events. We investigate the feasibility of using such textual streams for extracting city events from annotated text. We formalize the problem of annotating social streams such as microblogs as a sequence labeling problem. We present a novel training data creation process for training sequence labeling models. Our automatic training data creation process utilizes instance level domain knowledge (e.g., locations in a city, possible event terms). We compare this automated annotation process to a state-of-the-art tool that needs manually created training data and show that it has comparable performance in annotation tasks. An aggregation algorithm is then presented for event extraction from annotated text. We carry out a comprehensive evaluation of the event annotation and event extraction on a real-world dataset consisting of event reports and tweets collected over four months from San Francisco Bay Area. The evaluation results are promising and provide insights into the utility of social stream for extracting city events.
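The automatic training-data creation described above can be sketched as gazetteer-based annotation: tokens matching instance-level domain knowledge (city locations, event terms) receive BIO-style sequence labels. This is a simplified illustration, not the paper's actual pipeline; the lexicons and the example tweet are invented.

```python
# Hypothetical instance-level domain knowledge for one city
LOCATIONS = {"bay bridge", "market street"}
EVENT_TERMS = {"accident", "congestion", "closure"}

def bio_tag(tokens):
    """Label tokens with B-LOC/I-LOC, B-EVT, or O using the lexicons."""
    labels = ["O"] * len(tokens)
    for i in range(len(tokens)):
        # Two-word location match takes priority
        if i + 1 < len(tokens) and f"{tokens[i]} {tokens[i+1]}".lower() in LOCATIONS:
            labels[i], labels[i + 1] = "B-LOC", "I-LOC"
        elif tokens[i].lower() in EVENT_TERMS and labels[i] == "O":
            labels[i] = "B-EVT"
    return labels

tokens = "Heavy congestion on the Bay Bridge after accident".split()
print(list(zip(tokens, bio_tag(tokens))))
```

Labeled text like this can then serve as training data for a sequence-labeling model, which is the role the automated annotation process plays in the work above.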
The document discusses master data management in the city of Gothenburg. It describes challenges around managing structured and unstructured data from multiple systems. It then demonstrates a pilot project that uses semantic standards, models and open vocabularies to link and query master data. Entities like people, organizations, locations and services are modeled. The pilot shows how data can be accessed through a SPARQL endpoint or API to enable interoperability.
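The kind of linked-data query such a pilot exposes through a SPARQL endpoint can be reduced, for illustration, to triple-pattern matching. The sketch below uses plain Python tuples and invented entity names; `None` plays the role of a SPARQL variable.

```python
triples = {
    ("gbg:JaneDoe",   "rdf:type",     "foaf:Person"),
    ("gbg:JaneDoe",   "gbg:worksFor", "gbg:ParksDept"),
    ("gbg:ParksDept", "rdf:type",     "org:Organization"),
    ("gbg:ParksDept", "gbg:locatedIn","gbg:Gothenburg"),
}

def match(triples, s=None, p=None, o=None):
    """Return triples matching a pattern; None matches anything."""
    return [
        t for t in triples
        if (s is None or t[0] == s)
        and (p is None or t[1] == p)
        and (o is None or t[2] == o)
    ]

# Roughly: SELECT ?x WHERE { ?x rdf:type org:Organization }
orgs = [s for (s, _, _) in match(triples, p="rdf:type", o="org:Organization")]
print(orgs)  # ['gbg:ParksDept']
```

Because people, organizations, locations, and services share one triple vocabulary, the same pattern mechanism answers queries across what were previously separate master-data systems.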
The document discusses integrating sensor and social data to understand city events. It describes collecting data from multiple sources, including sensors and social media. Statistical models are used to analyze the sensor data and identify anomalies, which are then correlated with events extracted from social media using spatial and temporal proximity. The approach is evaluated on traffic data from San Francisco, integrating data from traffic sensors and Twitter to extract and corroborate traffic events.
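The spatial and temporal proximity matching described above can be sketched as follows: a sensor anomaly corroborates a social-media event when the two fall within given time and distance thresholds. The data points and thresholds below are invented for illustration.

```python
from math import hypot

def correlated(anomaly, event, max_minutes=30, max_km=2.0):
    """True if the anomaly and event are close in both time and space."""
    dt = abs(anomaly["t"] - event["t"])      # minutes apart
    dkm = hypot(anomaly["x"] - event["x"],   # planar distance in km
                anomaly["y"] - event["y"])
    return dt <= max_minutes and dkm <= max_km

anomaly = {"t": 100, "x": 0.0, "y": 0.0}      # e.g., a sudden speed drop
tweet_event = {"t": 112, "x": 0.8, "y": 0.5}  # extracted "accident" report
far_event = {"t": 400, "x": 9.0, "y": 9.0}

print(correlated(anomaly, tweet_event))  # True
print(correlated(anomaly, far_event))    # False
```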
This presentation covers the requirements to get started with HunchLab 2.0's predictive policing system. It starts discussing technical requirements (security, authentication) and then proceeds to discuss guidelines for configuring meaningful predictive models of crime. The presentation concludes with information about related geographic and temporal datasets that are useful in forecasting crime with recommendations on how to prioritize data sets to use in HunchLab.
How to Achieve Cross-Industry Semantic Interoperability, by Doug Migliori
The document discusses achieving cross-industry semantic interoperability through developing a common information model and ontologies. It proposes a "blended" approach that combines concepts from different standards organizations to minimize semantic disparity across industries. This would involve a top-level ontology, an ontology for an information model, and a system ontology to define relationships between business and device systems and support use cases across multiple industries.
The document discusses the concept of service-oriented science and describes several key aspects:
1) People can create services (data, code, instruments), discover and decide whether to use existing services, and compose services to create new functions.
2) Services are hosted by "someone else" so individuals do not need expertise in operating services and computers. It is hoped this entity can manage security, reliability, and scalability.
3) Discovery, composition, publishing, and hosting of services are important aspects that enable service-oriented science. Standards, registries, tagging, and social networks help with discovery, while workflows, containers, and dynamic provisioning support composition and hosting at scale.
Emerging Dynamic TUW-ASE Summer 2015 - Distributed Systems and Challenges for…, by Hong-Linh Truong
This is a lecture from the advanced service engineering course from the Vienna University of Technology. See http://dsg.tuwien.ac.at/teaching/courses/ase/
NIEM is a data exchange model used across federal, state, local, and tribal governments. Its Business Architecture Committee (NBAC) ran from June 2009 through November 2013. OmniClass facility types are built in to advance public safety.
Why the Brick Schema is a Game Changer for Smart Buildings?, by Memoori
Why is the Brick Schema a Game Changer for Smart Buildings? Memoori is joined by founder members of the Brick open schema project: Gabe Fierro, University of California, Berkeley; Jason Koh, University of California, San Diego; and Yiyi Guo from Johnson Controls. Learn how the Brick Schema can help overcome challenges in building IoT data management.
Accounting Value Effects for Responsible Networking, by Giovanni Sileno
The document discusses approaches to achieving a "responsible internet" through redistributing control and monitoring abilities to users. It identifies two issues with the existing "Responsible Internet" proposal:
1) A "responsibility gap" exists because low-level programmability for users does not address their ability to foresee outcomes or assess actions according to values.
2) The power relationships between users, network operators, and governance bodies are not partially programmable, requiring models of how the world functions and what is valuable, as well as mechanisms for consolidating norms from multiple sources into policies at different levels.
The document discusses using web services for air quality management by accessing distributed data sources and processing the data through filtering, aggregation and fusion. It proposes a service-oriented architecture called DataFed.Net to access, process and deliver air quality information. As an example, it describes how web services could be used to monitor and predict the impact of smoke from fires on particulate matter concentrations.
Distributed Systems: How to connect your real-time applications, by Jaime Martin Losa
This document provides an overview of distributed systems and how to connect real-time applications using the Data Distribution Service (DDS) standard. It introduces DDS and its architecture, including topics, instances, keys, and quality-of-service policies. It then demonstrates how to create a basic "hello world" publisher/subscriber example in both eProsima Fast RTPS and RTI Connext DDS middleware in three steps: defining the data type, generating code, and building and running the publisher and subscriber.
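The core pattern DDS standardizes, publishers and subscribers decoupled through named topics, can be sketched in a few lines. This is emphatically not the DDS API: real middleware such as eProsima Fast RTPS or RTI Connext adds peer discovery, IDL-generated typed data, and per-topic quality-of-service policies. The in-process bus below only illustrates the topic-based decoupling idea.

```python
from collections import defaultdict

class Bus:
    """Toy in-process publish/subscribe bus keyed by topic name."""
    def __init__(self):
        self.subscribers = defaultdict(list)  # topic -> list of callbacks

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, sample):
        # Deliver the sample to every subscriber of this topic
        for cb in self.subscribers[topic]:
            cb(sample)

bus = Bus()
received = []
bus.subscribe("HelloWorldTopic", received.append)
bus.publish("HelloWorldTopic", {"index": 1, "message": "Hello World"})
print(received)  # [{'index': 1, 'message': 'Hello World'}]
```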
This document contains information about the course Information Technology for Managers at IQRA University. The course code is I.T. for Managers. The facilitator is Qazi Shahab Azam, who holds an MBA. The course takes place in the 2nd semester. The student named is Syed Zamin Ali Shah, student number 8914. The assignment is mandatory assignment number 2, dated May 6, 2014.
oneM2M - Management, Abstraction and Semantics, by oneM2M
The document discusses concepts related to management, abstraction, and semantics in oneM2M including:
- Management provides unified APIs for configuring, monitoring, and managing devices, applications, and service entities.
- Abstraction hides the complexity of specific technologies by providing a single, unified information model and methods for applications.
- Semantics adds meaning and relationships between concepts to enable machine-understandable interoperability.
- oneM2M provides resource models and protocols for management, and attributes for basic semantic annotation. Interworking proxies map non-oneM2M models to common oneM2M resources.
The document provides an overview of the Lightweight Directory Access Protocol (LDAP). It discusses how LDAP defines a standard method for accessing and updating directory information. Key points include:
- LDAP is optimized for read access of information that needs to be accessed from multiple locations and referenced by many applications. It is not well-suited for rapidly changing information.
- The core LDAP protocol defines operations like search, compare, add, delete, and modify to access and update directory entries stored on an LDAP server.
- Directories store typed and ordered information about objects in a hierarchical tree structure defined by distinguished names.
- Popular LDAP server software includes Microsoft Active Directory, Oracle Internet Directory, and Apache Directory Server.
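The hierarchical naming described above can be made concrete: a distinguished name (DN) is an ordered sequence of attribute=value pairs leading from the entry up to the directory root. The sketch below handles only simple DNs (no escaped commas, a case RFC 4514 allows but this toy parser ignores); the example DN is invented.

```python
def parse_dn(dn):
    """Split a simple DN (no escaped commas) into (attribute, value) pairs."""
    pairs = []
    for rdn in dn.split(","):
        attr, _, value = rdn.strip().partition("=")
        pairs.append((attr, value))
    return pairs

dn = "cn=Jane Doe,ou=Engineering,dc=example,dc=com"
print(parse_dn(dn))
# [('cn', 'Jane Doe'), ('ou', 'Engineering'), ('dc', 'example'), ('dc', 'com')]
```

Reading the pairs right to left traces the tree: the `dc` components name the root, `ou` a branch, and `cn` the leaf entry, which is exactly the structure LDAP search operations navigate.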
Overview of schedule and cost risk analysis methodology for the aerospace industry.
For more information on how to perform schedule risk analysis using RiskyProject software, please visit the Intaver Institute web site: http://www.intaver.com.
About Intaver Institute.
Intaver Institute Inc. develops project risk management and project risk analysis software. Intaver's flagship product is RiskyProject: project risk management software. RiskyProject integrates with Microsoft Project, Oracle Primavera, other project management software or can run standalone. RiskyProject comes in three configurations: RiskyProject Lite, RiskyProject Professional, and RiskyProject Enterprise.
HijackLoader Evolution: Interactive Process Hollowing, by Donato Onofri
CrowdStrike researchers have identified a HijackLoader (aka IDAT Loader) sample that employs sophisticated evasion techniques to enhance the complexity of the threat. HijackLoader, an increasingly popular tool among adversaries for deploying additional payloads and tooling, continues to evolve as its developers experiment and enhance its capabilities.
In their analysis of a recent HijackLoader sample, CrowdStrike researchers discovered new techniques designed to increase the defense evasion capabilities of the loader. The malware developer used a standard process hollowing technique coupled with an additional trigger that was activated by the parent process writing to a pipe. This new approach, called "Interactive Process Hollowing", has the potential to make defense evasion stealthier.
2. IBM Research
You are hearing about the benefits of smart cities and the Internet of Things.
One essential assumption: interoperability.
This talk is about the plumbing that makes it possible:
– Information exchange standards
– Technologies that can be used with these standards
3. IBM Research
Connectivity ≠ Interoperability
– Information content: lots of choices, not so much attention
– IP layer: strong agreement here
– Communication protocols (Zigbee, WiFi, power line): lots of choices, lots of attention
4. IBM Research
What's the problem?
Misunderstanding of information flowing between systems
"Set Thruster to 324.59"
September 1999: the Mars Climate Orbiter was lost because one system produced thruster impulse in pound-force seconds while another interpreted the number as newton-seconds.
5. IBM Research
Context
Many standards provide context through supporting documentation for human implementers to read – and possibly misinterpret.
A good information standard explicitly captures metadata (context) as:
– An information model (e.g., UML), or
– A semantic model (e.g., OWL), or
– A formal logic model (e.g., first-order logic)
6. IBM Research
Need context (a.k.a. metadata)
"Thruster setting to Impulse with value 324.59 Newton-seconds"
[Diagram: the quantity (thruster setting), the measurement kind (impulse), the unit (newton-second), and the value (324.59) each made explicit]
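The point of the annotated reading above can be sketched in a few lines of Python: a bare number like 324.59 is ambiguous, but a value that carries an explicit unit can be converted safely by the receiver. The reading structure and conversion factor below are illustrative assumptions, not part of any standard in this talk.

```python
# Minimal sketch: attach unit metadata so the receiver converts
# instead of guessing. (Hypothetical data layout, not a standard.)
NS_PER_LBF_S = 4.448222  # newton-seconds per pound-force second

def to_newton_seconds(value, unit):
    """Normalize an impulse reading to newton-seconds."""
    if unit == "N*s":
        return value
    if unit == "lbf*s":
        return value * NS_PER_LBF_S
    raise ValueError(f"unknown unit: {unit}")

# With the unit explicit, 324.59 is unambiguous:
reading = {"quantity": "impulse", "value": 324.59, "unit": "lbf*s"}
print(round(to_newton_seconds(reading["value"], reading["unit"]), 2))
```

A receiver that gets only `324.59` has to assume a unit; a receiver that gets the `(value, unit)` pair can convert or reject.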
7. IBM Research
Every API or protocol standard should be supported by an information model that defines all the relevant concepts.
"Need smart data so you don't need such smart software"
– Krzysztof Janowicz, UCSB
11. IBM Research
Information models for city systems
http://www.ibm.com/developerworks/industry/library/ind-smartercitydatamodel1/index.html

Common Alerting Protocol (CAP)
– Supported concepts: category, status, scope, certainty, severity, urgency, onset time, expiration time, response type, instructions
– Current deployments: international standard (OASIS plus ITU-T Recommendation X.1303) with deployments primarily in the U.S., including the Department of Homeland Security, National Weather Service, United States Geological Survey, California Office of Emergency Services, Virginia Department of Transportation, and Oregon RAINS.

National Information Exchange Model (NIEM)
– Supported concepts: activity, address, case, date and time, document, item, incident, location, organization, person
– Current deployments: U.S. specific, including the U.S. Department of Homeland Security, U.S. Department of Justice, U.S. Citizenship and Immigration Services, Federal Emergency Management Agency (FEMA), Law Enforcement Information Sharing Program, Logical Entity eXchange Specifications, OneDOJ, and the Department of Health and Human Services.

OpenGIS Geography Markup Language (GML)
– Supported concepts: point
– Current deployments: international standard (OGC) that is widely used in the industry and considered the reference in its space.

OpenGIS Location Services (OpenLS)
– Supported concepts: location
– Current deployments: international standard (OGC) that is used for location-based applications such as cell phone apps.

SOA Ontology
– Supported concepts: service, process, task, event, human actor, effect, system, policy, service contract, service interface, element
– Current deployments: international standard (TOG) that encapsulates SOA vocabulary and relationships and is used to describe and model SOA solutions.

Universal Core
– Supported concepts: entities and assets, events and alerts, people, organizations, locations, collections
– Current deployments: U.S.-specific standard jointly managed by the Departments of Defense, Justice, and Homeland Security and the Office of the Director of National Intelligence. Within the DoD, the Marine Corps, Navy, and Air Force appear to be committed to supporting UC.

W3C Time Ontology in OWL
– Supported concepts: Interval, DurationDescription, DateTimeDescription, DayOfWeek
– Current deployments: international standard (W3C) with uncertain adoption; our web search revealed no concrete deployments.
12. IBM Research
Some Relevant Concepts
Organization
– Definition: A group of persons organized for a particular purpose
– Examples: Police department, public housing department, bus department, transportation agency, water agency, electric utility
– Key attributes: Name, type of organization, description, identification, website
– Key relationships: Organizations (parent-child), assets, location
– Standards assessment: National Information Exchange Model (NIEM) NIEM-Core nc:OrganizationType, UCore (Universal Core) Organization
Alert
– Definition: A warning or alarm for an imminent event
– Examples: Road repair advisory
– Key attributes: Sender, description, urgency, severity, certainty, onset time, location, supporting resources
– Key relationships: Sender (organization or person), location, incident, work orders
– Standards assessment: The Common Alerting Protocol (CAP) has extensive support for the alert concept. The UCore Event concepts are also applicable.
Incident
– Definition: An occurrence or an event that may require a response
– Examples: Road repair, automobile accident, water main bursting, criminal activity
– Key attributes: Date and time of the incident, description, ID
– Key relationships: Location, alerts, work orders, owner (organization or person)
– Standards assessment: NIEM nc:IncidentType, CAP alert:incidents, UCore Event, Service-Oriented Architecture (SOA) Ontology Event
Person
– Definition: A human being, an individual
– Examples: James, Bob, Sally
– Key attributes: Full name, given name, family name, gender, date of birth, place of birth, citizenship, country of birth
– Key relationships: Employer, location, address, organization, role (such as operator, supervisor, responder, analyst, asset manager)
– Standards assessment: NIEM nc:PersonType, UCore Person, SOA Ontology Human Actor
Asset
– Definition: A tangible object that can be tracked over time
– Examples: Road, water pipe, electric capacitor, bus, building
– Key attributes: Description, ID
– Key relationships: Organization, person, manufacturer, location, work order, incident
– Standards assessment: NIEM ip:AssetType, UCore Entity
Work order
– Definition: An order to do some work; to fix, repair, or replace
– Examples: Road repair, utility maintenance on a main valve, re-routing of buses
– Key attributes: Description, ID, comment, priority, status, location, start date/time, stop date/time
– Key relationships: Work steps, work orders (parent-child), incident, alert, organization, maintenance history, specification, person, assets
– Standards assessment: No relevant standards identified at this time.
Process and procedure
– Definition: A series of actions to accomplish a goal
– Examples: Road repair notification and coordination
– Key attributes: Process document
– Key relationships: Process steps, work orders, incident, alert, organization, person, assets
– Standards assessment: SOA Ontology Process
Key Performance Indicator
– Definition: A measurement or criterion to assay the condition or performance of a person, process, or thing
– Examples: Response time, time to closure, cost to city, savings to city, impact to city services
– Key attributes: Description, metrics, thresholds
– Key relationships: KPI (parent-child), organization, incident, alert, process and procedure, asset
– Standards assessment: No relevant standards identified at this time.
Location
– Definition: A geographic place, point, position, or area identified by its coordinates in an earth-based coordinate system, name, or address
– Examples: Road repair location: city intersection, water pipe location
– Key attributes: Geographic coordinates, postal address, timestamp
– Key relationships: Person, organization, asset, incident, alert
– Standards assessment: NIEM nc:LocationType, UCore Location, Geography Markup Language (GML) Point, OpenGIS® Open Location Service (OpenLS) Location
Time
– Definition: Measuring system used to sequence events, to compare the durations of events and the intervals between them, and to quantify rates of change such as the motions of objects
– Examples: Start time, end time
– Key attributes: Years, months, weeks, days, hours, minutes, seconds, milliseconds
– Key relationships: Duration
– Standards assessment: NIEM nc:DateTime, W3C DateTimeDescription
Source: Rick Robinson, Executive Architect, Smarter Cities, IBM
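Two of the concepts above (Alert and Incident) can be sketched as plain Python dataclasses. The field names follow the slide's "key attributes" and "key relationships" lists and are illustrative only; this is not a CAP or NIEM binding.

```python
# Illustrative sketch of the Alert and Incident concepts; field names
# are assumptions drawn from the attribute lists above.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Incident:
    incident_id: str      # key attribute: ID
    description: str
    occurred_at: str      # date and time of the incident (ISO 8601)

@dataclass
class Alert:
    sender: str           # key relationship: organization or person
    description: str
    urgency: str
    severity: str
    certainty: str
    incident: Optional[Incident] = None  # key relationship: incident

repair = Incident("INC-42", "Road repair at a city intersection",
                  "2015-05-14T09:00:00Z")
advisory = Alert("transportation agency", "Road repair advisory",
                 "Expected", "Minor", "Observed", incident=repair)
print(advisory.incident.incident_id)
```

The point of the exercise is that once the attributes and relationships are explicit, they can be checked against a standard (CAP for Alert, NIEM for Incident) field by field.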
13. IBM Research
Some Smart Grid Standards
[Diagram: a crowded landscape of smart grid standards, posing the question "Abstract Model – Shared Concepts, Fragments???"]
– CIM: IEC 61968, IEC 61970, IEC 62325
– IEC 61850; 61850-410 (hydro); 61850-420 (DER, solar…); 61400-25-2 (wind)
– ISO 16484, BACnet
– MultiSpeak, C12.19
– WS-Calendar, NAESB energy usage info, EMIX, OASIS Energy Interop, ZigBee Smart Energy Profile
– ISA-88, ISA-95, CEA-709 (LonTalk)
– 62351-7 (communication network and system management), ICCP
– IEEE 1815 (DNP3), IEEE C37.239 (COMFEDE), NASPI
– ASHRAE SPC 201 (FSGIM), OpenADR
14. IBM Research
"Meter"
– IEC 61968
– IEC 61970
– NAESB PAP10
– MultiSpeak V4.1
Do they all really mean the same thing?
15. IBM Research
What to do?
– Require all standards to use the same universal definitions for all terms
– Require all standards to explicitly define all terms (in machine-readable form)
– Use technology to automatically map terms in one standard to terms in another
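The third option can be sketched as an explicit, machine-readable mapping from each standard's local term to a shared neutral concept, so terms translate automatically. The standard names echo the "Meter" slide; the neutral concept name and the mapping table itself are hypothetical.

```python
# Sketch: translate terms between standards via a shared neutral
# concept. The concept URI and mapping entries are assumptions.
NEUTRAL = "city:EnergyMeter"  # hypothetical neutral-model concept

TERM_MAP = {
    ("IEC 61968", "Meter"): NEUTRAL,
    ("MultiSpeak V4.1", "Meter"): NEUTRAL,
    ("NAESB PAP10", "Meter"): NEUTRAL,
}

def translate(standard, term, target_standard):
    """Map a local term to the neutral concept, then back out
    to the target standard's local term."""
    concept = TERM_MAP[(standard, term)]
    for (std, local_term), c in TERM_MAP.items():
        if std == target_standard and c == concept:
            return local_term
    raise KeyError(f"no {target_standard} term for {concept}")

print(translate("IEC 61968", "Meter", "MultiSpeak V4.1"))
```

Routing through a neutral concept keeps the table linear in the number of standards, instead of quadratic pairwise mappings.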
16. IBM Research
Semantic data integration for the Internet of Things
Data sources: CMUSV sensor data, NASA sensor data, NICT (Japan) sensor data
Approach:
– Retrieve data
– Import or synthesize ontologies
– Map to a general ontology
Technologies: RDF, OWL, SPARQL, SPIN
Also enables: inferencing over data, federated data stores, distributed queries, …
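The "map to a general ontology" step can be sketched as follows: readings from two sources with different local property names are emitted as RDF N-Triples under one shared predicate, so a single SPARQL query could later cover both stores. Every URI below is made up for illustration.

```python
# Hedged sketch of mapping source-local vocabularies into one
# (hypothetical) general ontology, serialized as N-Triples.
GEN = "http://example.org/ontology#"   # assumed general ontology namespace
XSD_DECIMAL = "http://www.w3.org/2001/XMLSchema#decimal"

# Source-local property name -> general ontology term (assumed mapping)
PROPERTY_MAP = {
    "tempC": GEN + "temperatureCelsius",        # one source's name
    "temp_celsius": GEN + "temperatureCelsius", # another source's name
}

def to_ntriple(sensor_id, local_term, value):
    """Emit one N-Triples statement using the shared vocabulary."""
    subj = f"<http://example.org/sensor/{sensor_id}>"
    pred = f"<{PROPERTY_MAP[local_term]}>"
    obj = f'"{value}"^^<{XSD_DECIMAL}>'
    return f"{subj} {pred} {obj} ."

# Two sources, two local names, one shared predicate:
print(to_ntriple("cmusv-01", "tempC", 21.5))
print(to_ntriple("nasa-07", "temp_celsius", 19.0))
```

Once both sources emit the same predicate, inferencing and distributed queries over the combined data become possible without changing either source.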
21. IBM Research
Querying DBpedia

PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
PREFIX xsd: <http://www.w3.org/2001/XMLSchema#>
PREFIX dbpprop: <http://dbpedia.org/property/>

SELECT DISTINCT ?buildingName ?floors
WHERE {
  ?building a ?o .
  ?o rdfs:subClassOf ?class .
  FILTER (regex(str(?o), "dubai", "i")) .
  ?building dbpprop:floorCount ?floors .
  FILTER (?floors > 80) .
  FILTER (datatype(?floors) = xsd:integer) .
  ?building dbpprop:name ?buildingName .
}
ORDER BY ?floors

(Show all entries for buildings with more than 80 floors)
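A query like the one above can be submitted to the public DBpedia SPARQL endpoint over HTTP GET. The sketch below only constructs the request URL; actually fetching results requires network access, and the endpoint URL and parameter names reflect the common SPARQL Protocol convention.

```python
# Sketch: build a request URL for a SPARQL endpoint (no network I/O).
from urllib.parse import urlencode

ENDPOINT = "https://dbpedia.org/sparql"

def build_url(query):
    """Request URL asking the endpoint for JSON results."""
    params = {"query": query,
              "format": "application/sparql-results+json"}
    return ENDPOINT + "?" + urlencode(params)

query = "SELECT DISTINCT ?name WHERE { ?b ?p ?name } LIMIT 5"
print(build_url(query))
```

The returned JSON follows the SPARQL Query Results format, with bindings keyed by the variable names in the SELECT clause.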
22. IBM Research
Interoperability standards battles
– AT&T, Cisco, IBM, Intel… plus ~165 others
– Intel, Samsung, GE, Cisco, Dell… plus ~50 others
– Microsoft, Cisco, Qualcomm, Sony, LG… plus ~140 others
– Thread Group: Google (Nest), ARM, Samsung appliance… plus ~120 others
– Apple HomeKit: "dozens of partners" as of May 14, 2015
– Hewlett-Packard, Cisco, IBM, Oracle, Philips plus ~400 others
23. IBM Research
Takeaway for Smart Cities
– We need to share information across different contexts
– Information needs to be defined in a way that machines can understand the meaning
– A neutral model and semantic mappings give us a way to manage complexity
– Semantic web technology is well-suited to loosely-coupled, distributed, linked communities of systems