OWL 2 adds several new features to OWL, including:
1) A cleaner language design, with an axiom-centered structural specification and a functional-style syntax.
2) Increased expressiveness through property features such as property chains, qualified cardinality restrictions, and datatype restrictions on properties.
3) Enhanced datatypes, including new datatypes, datatype definitions, and data range combinations.
4) Profiles such as OWL 2 EL, QL, and RL that provide different tradeoffs between expressiveness and reasoning complexity.
A non-technical explanation of the main ideas and notions in OWL. This talk was also recorded on video and is available online at http://videolectures.net/koml04_harmelen_o/
RDF Constraint Checking using RDF Data Descriptions (RDD) (Alexander Schätzle)
Linked Open Data (LOD) sources on the Web are increasingly becoming a mainstream method to publish and consume data. For real-life applications, mechanisms to describe the structure of the data and to provide guarantees are needed, as recently emphasized by the W3C in its Data Shapes Working Group. Using such mechanisms, data providers will be able to validate their data, assuring that it is structured in the way expected by data consumers. In turn, data consumers can design and optimize their applications to match the data format to be processed.
In this paper, we present several crucial aspects of RDD, our language for expressing RDF constraints. We introduce the formal semantics and describe how RDD constraints can be translated into SPARQL for constraint checking. Based on our fully working validator, we evaluate the feasibility and efficiency of this checking process using two popular, state-of-the-art RDF triple stores. The results indicate that even a naive implementation of RDD based on SPARQL 1.0 incurs only a moderate overhead on the RDF loading process, yet some constraint types contribute an outsize share of that overhead and scale poorly. Incorporating several preliminary optimizations, some of them based on SPARQL 1.1, we provide insights on how to overcome these limitations.
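RDD's actual syntax and compilation rules are not reproduced in this abstract; as a rough illustration of the general idea (each constraint type is compiled into a SPARQL query whose solutions are exactly the violating nodes), a minimal sketch with a hypothetical "mandatory property" constraint:

```python
# Illustrative sketch only, not RDD's real compiler: a hypothetical
# "every instance of cls must have prop" constraint is turned into a
# SPARQL 1.0 query that returns the subjects violating the constraint.

def missing_property_to_sparql(cls, prop):
    """Compile the constraint into a violation-finding SPARQL query."""
    return (
        "SELECT ?s WHERE {\n"
        f"  ?s a <{cls}> .\n"
        f"  OPTIONAL {{ ?s <{prop}> ?v }}\n"
        "  FILTER (!bound(?v))\n"   # keep only subjects with no value for prop
        "}"
    )

query = missing_property_to_sparql("http://ex.org/Person", "http://ex.org/name")
print(query)
```

Running such a query against the loaded data and reporting any bindings as violations is, in essence, the "naive" checking process the abstract evaluates.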
Integrating a Domain Ontology Development Environment and an Ontology Search ... (Takeshi Morita)
In order to reduce the cost of building domain ontologies manually, we propose in this paper a method and a tool, named DODDLE-OWL, for domain ontology construction that reuses texts and existing ontologies retrieved by the ontology search engine Swoogle. In an experimental evaluation, we applied the method to a particular field of law and evaluated the acquired ontologies.
Comparison of features between ShEx (Shape Expressions) and SHACL (Shapes Constraint Language)
Changelog:
11/06/17
- Removed slides about compositionality
31/May/2017
- Added slide 30 about validation report
- Added slide 32 about stems
- Changed slides 7 and 8 adapting compact syntax to new operator .
23/05/2017:
Slide 14: Repaired typos in sh:entailment, rdfs:range
21/05/2017:
- Slide 8. Changed the example to be an IRI and a datatype
- Added 'typically' in slide 9
- Slide 10: Removed the phrase: "Target declarations can be problematic when reusing/importing shapes"
and created slide 27 to talk about reusability
- Added slide 11 to talk about the differences in triggering validation
- Created slide 14 to talk about inference
- Renamed slide 15 as "Inference and triggering mechanism"
- Added slides 27 and 28 to talk about reusability
- Added slide 29 to talk about annotations
18/05/2017
- Slide 9 now includes an example using the ShEx RDF vocabulary
- Slide 10 now says that target declarations are optional
- Slide 13 now says that some RDF Schema terms have special treatment in SHACL
- Example in slide 18 now uses sh:or instead of sh:and
- Added slides 22, 23 and 24 which show some features supported by SHACL but not supported by ShEx (property pair constraints, uniqueLang and owl:imports)
Introduction to search engine-building with Lucene (Kai Chan)
These are the slides for the session I presented at SoCal Code Camp Los Angeles on October 14, 2012.
http://www.socalcodecamp.com/session.aspx?sid=a4774b3c-7a2d-45db-8721-f54c5a314e17
Introduction to libre « fulltext » technology (Robert Viseur)
The presentation will be based on my personal experience with SQLite, MySQL and Zend Search; on workshops I have attended (PostgreSQL); and on tests conducted under my supervision (PostgreSQL, MySQL, Sphinx, Lucene, Xapian). It will give a comprehensive overview of existing techniques, from the most basic to the most advanced, leading to a comparative table of the existing technologies.
Introduction to search engine-building with Lucene (Kai Chan)
These are the slides for the session I presented at SoCal Code Camp San Diego on June 24, 2012.
http://www.socalcodecamp.com/session.aspx?sid=f9e83f56-3c56-4aa1-9cff-154c6537ccbe
This talk was given by FORTH, Greece, at the European Data Forum (EDF) 2012, which took place on June 6-7, 2012 in Copenhagen, Denmark, at the Copenhagen Business School (CBS).
Abstract:
Given the increasing amount of sensitive RDF data available on the Web, it becomes increasingly critical to guarantee secure access to this content. Access control is complicated when RDFS inference rules and other dependencies between the access permissions of triples need to be considered; this is necessary, e.g., when we want to associate the access permissions of inferred triples with those of the triples that implied them. In this paper we advocate the use of abstract provenance models, defined by means of abstract tokens and operators, to support fine-grained access control for RDF graphs. The access label of a triple is a complex expression that encodes how that label was produced (i.e., the triples that contributed to its computation). This feature allows us to know exactly the effects of any possible change, thereby avoiding a complete recomputation of the labels when a change occurs. In addition, the same application can choose to enforce different access control policies, or different applications can enforce different policies on the same data, without recomputing the label of a triple. Preliminary experiments have shown the applicability and benefits of our approach.
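The paper's actual token algebra is not shown in the abstract; as a minimal sketch of the idea, assuming a single AND-style operator for RDFS inference, labels can be kept as symbolic expressions over tokens so that a permission change only requires re-evaluating the expressions that mention the changed token:

```python
# Minimal sketch (not the paper's real model): each explicit triple gets an
# abstract access token; an inferred triple's label is a symbolic expression
# over the tokens of the triples that implied it.

def make_and(*labels):
    """Label of an inferred triple: accessible only if all premises are."""
    return ("AND",) + labels

def evaluate(label, assignment):
    """Evaluate a symbolic label against concrete token permissions."""
    if isinstance(label, str):          # an atomic token
        return assignment[label]
    op, *args = label
    if op == "AND":
        return all(evaluate(a, assignment) for a in args)
    raise ValueError(f"unknown operator: {op}")

# Example: (:alice a :Employee) carries token "t1", the schema triple
# (:Employee rdfs:subClassOf :Person) carries "t2"; the RDFS-inferred
# triple (:alice a :Person) gets the label AND(t1, t2).
inferred = make_and("t1", "t2")
print(evaluate(inferred, {"t1": True, "t2": True}))   # True
print(evaluate(inferred, {"t1": True, "t2": False}))  # False
```

Because the expression stays symbolic, revoking "t2" changes the result of evaluation without re-running the inference that produced the label.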
Doctoral Examination at the Karlsruhe Institute of Technology, 08.07.2016 (Dr.-Ing. Thomas Hartmann)
In this thesis, a validation framework is introduced that makes it possible to consistently execute RDF-based constraint languages on RDF data and to formulate constraints of any type. The framework reduces the representation of constraints to the absolute minimum, is based on formal logics, and consists of a small, lightweight vocabulary; it ensures consistency of validation results and enables constraint transformations for each constraint type across RDF-based constraint languages.
Presentation at the ESWC 2011 PhD Symposium in May 2011, by Michael Schneider, FZI. Included are backup slides that have not been presented at the event. The corresponding PhD proposal can be found in the ESWC proceedings at <http: />. Alternatively, the PhD proposal can be downloaded from <http: />.
OWL stands for Web Ontology Language
OWL is built on top of RDF
OWL is for processing information on the web
OWL was designed to be interpreted by computers
OWL was not designed to be read by people
OWL is commonly written in XML (RDF/XML)
OWL has three sublanguages
- OWL Lite , OWL DL , OWL Full
OWL is a W3C standard
Presentation of the paper "Reasoning in the OWL 2 Full Ontology Language using First-Order Automated Theorem Proving" by Michael Schneider, FZI Karlsruhe, and Geoff Sutcliffe, University of Miami, at the 23rd International Conference on Automated Deduction (CADE 23), August 2011.
Although animals do not use language, they are capable of many of the same kinds of cognition as we are; much of our experience is at a non-verbal level.
Semantics is the bridge between surface forms used in language and what we do and experience.
Language understanding depends on world knowledge (e.g., "the pig is in the pen" vs. "the ink is in the pen")
We might not be ready for executives to specify policies themselves, but we can make the process from specification to behavior more automated, linked to precise vocabulary, and more traceable.
Advances such as SBVR and an English serialization for ISO Common Logic mean that executives and line workers can understand why the system does certain things, or verify that policies and regulations are implemented.
Structured Dynamics provides 'ontology-driven applications'. Our product stack is geared to enable the semantic enterprise. The products are premised on preserving and leveraging existing information assets in an incremental, low-risk way. SD's products span from converters to authoring environments to Web services middleware, and on to ontologies, user interfaces and applications.
The formulation of constraints and the validation of RDF data against these constraints is a common requirement and a much sought-after feature, particularly as this is taken for granted in the XML world. Recently, RDF validation as a research field gained speed due to shared needs of data practitioners from a variety of domains. For constraint formulation and RDF data validation, several languages exist or are currently developed. Yet, none of the languages is able to meet all requirements raised by data professionals.
We have published a set of constraint types that are required by diverse stakeholders for data applications. We use these constraint types to gain a better understanding of the expressiveness of solutions, investigate the role that reasoning plays in practical data validation, and give directions for the further development of constraint languages.
We introduce a validation framework that makes it possible to consistently execute RDF-based constraint languages on RDF data and to formulate constraints of any type, in such a way that mappings from high-level constraint languages to an intermediate generic representation can be created straightforwardly. The framework reduces the representation of constraints to the absolute minimum, is based on formal logics, and consists of a very simple conceptual model with a small, lightweight vocabulary. We demonstrate that using another layer on top of SPARQL ensures consistency of validation results and enables constraint transformations for each constraint type across RDF-based constraint languages.
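The framework's actual vocabulary is not given here; to illustrate the mapping idea under stated assumptions (the field names and the generic tuple form below are invented for the example), one constraint expressed in a SHACL-like shape and a ShEx-like shape can both pass through a minimal intermediate representation:

```python
# Sketch only: hypothetical SHACL-style and ShEx-style surface forms for a
# minimum-cardinality constraint, reduced to one generic representation.
# Cross-language transformation then composes the two mappings.

def from_shacl_like(c):
    """Hypothetical SHACL-style dict -> generic (type, class, property, n)."""
    return ("minCount", c["targetClass"], c["path"], c["minCount"])

def to_shex_like(g):
    """Generic form -> hypothetical ShEx-style dict."""
    kind, cls, prop, n = g
    assert kind == "minCount"
    return {"shape": cls, "tripleConstraint": prop, "min": n}

generic = from_shacl_like(
    {"targetClass": "ex:Person", "path": "ex:name", "minCount": 1}
)
print(to_shex_like(generic))
```

Because every constraint type has exactly one generic form, a transformation between any pair of supported languages only needs the two mappings to and from that form, not a mapping per language pair.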
Managing Metadata for Science and Technology Studies: the RISIS case (Rinke Hoekstra)
Presentation of our paper at the WHISE workshop at ESWC 2016 on requirements for metadata over non-public datasets for the science & technology studies field.
Prov-O-Viz is a visualisation service for provenance graphs expressed using the W3C PROV vocabulary. It uses the Sankey-style visualisation from D3js.
See http://provoviz.org
Linkitup: Link Discovery for Research Data (Rinke Hoekstra)
Linkitup is a Web-based dashboard for enrichment of research output published via industry-grade data repository services. It takes metadata entered through Figshare.com and tries to find equivalent terms, categories, persons or entities on the Linked Data cloud and several Web 2.0 services. It extracts references from publications and tries to find the corresponding Digital Object Identifier (DOI). Linkitup feeds the enriched metadata back as links to the original article in the repository, but also builds an RDF representation of the metadata that can be downloaded separately or published as research output in its own right. In this paper, we compare Linkitup to the standard workflow of publishing linked data, and show that it significantly lowers the threshold for publishing linked research data.
A Network Analysis of Dutch Regulations - Using the MetaLex Document Server (Rinke Hoekstra)
In this paper we explore the possibilities of using the Linked Data representation of all Dutch regulations stored in the MetaLex Document Server for the purposes of network analysis over the citation graph between regulations, both at the document level and at the article level. We show that this is possible using relatively straightforward SPARQL queries, and present preliminary results of the analysis.
A Network Analysis of Dutch Regulations. Rinke Hoekstra. figshare.
http://dx.doi.org/10.6084/m9.figshare.689880
Retrieved 11:12, Oct 07, 2013 (GMT)
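The MetaLex server's actual vocabulary and queries are not reproduced in the description above; as a sketch of the kind of citation-graph analysis it mentions, with a citation modelled simply as a (citing, cited) pair and an assumed ex:cites property in the SPARQL comment:

```python
# Sketch under assumptions: in-degree over a toy citation edge list. This is
# the kind of aggregation a straightforward SPARQL query would yield, e.g.:
#   SELECT ?cited (COUNT(?citing) AS ?n)
#   WHERE { ?citing ex:cites ?cited } GROUP BY ?cited ORDER BY DESC(?n)
from collections import Counter

citations = [
    ("reg:A", "reg:C"),
    ("reg:B", "reg:C"),
    ("reg:B", "reg:D"),
]

# Count incoming citations per regulation (or per article, at finer grain).
in_degree = Counter(cited for _citing, cited in citations)
most_cited, count = in_degree.most_common(1)[0]
print(most_cited, count)  # reg:C is cited twice
```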
This presentation describes the use by Data2Semantics (http://www.data2semantics.org) of the VIVO portal (http://vivoweb.org) for interlinking researchers contributing to projects within the COMMIT programme (http://www.commit-nl.nl).
The Data2Semantics project (COMMIT P23) is all about enriching research data and making it more reusable for future research. Using Linked Data for this task is a fairly obvious step to make (surprise!). However, there are several shortcomings in the current practices of publishing Linked Data that call for a slightly different approach which (hopefully) bridges the gap between Web 2.0 and Web 3.0. I will present a proof-of-concept service (Linkitup) that works on top of existing scientific data repositories and allows individual researchers to enrich their data with additional (linked) metadata.
Talk about the use of Linked Data in historical research on census data. Has some slides about TabLInker as well (http://github.com/Data2Semantics/TabLinker). Part of the data2semantics project (http://data2semantics.org)
Presentation for the Dutch Tax Administration (Belastingdienst) in the context of a study into the possibilities and limitations of recognizing and extracting concepts and their definitions, and representing them using Semantic Web standards.
2. Why this topic? (… someone asked me …) Take-home message: "Sure, not everything about OWL 2 is great, but it does add some very nice new features that we can all use and learn to love"
3. Playing The Devil's Advocate
Where's the Web in OWL? Where's the Ontology in OWL? "OL" or "WL"?
OWL DL and OWL Full: "OWLDLED", "OWL is a description logic"
OWL and Rules: "Rules are just more intuitive", "People think in rules"
OWL and Philosophy: "OWL is philosophically flawed"
OWL 2 DL and reasoning: "Decidability is hugely overrated", "Consistency does not exist on the web", "OWL reasoners even die on very small knowledge bases", "I only need part of OWL, so why implement a fully OWL compliant reasoner"
Expressiveness: "OWL is not expressive enough for my needs", "OWL is way too expressive, no-one will ever need that", "The only useful addition of OWL to RDF is owl:sameAs"
4. DISCLAIMER Do not be confused by OWL 2 (or any other W3C standard). In the end, every standard is a compromise: the result of a 'political' debate between different communities, not of technical insight. Compatible revisions of existing standards inherit political issues and misconceptions, and then add some of their own. It's just that if the communities are technical, you end up with a 'technical' standard.
5. DISCLAIMER For OWL 2 this means: it replaces OWL 1, but is compatible. Species inheritance, including the OWL DL vs. OWL Full debate. Compatibility with other W3C standards. Social 'issues' within the WG: over-representation of the DL community, under-representation of the RDF/SW community.
6. Economics of OWL 2 Technology push: advancements in Description Logics research. Market pull: experiences, added expressiveness, other syntaxes, better ('easier') tool development. Caters for several communities: HC, LS, KR, SW, Engineering, Enterprise Systems.
7. Background OWLED workshops (60-70 people): first one in 2005; users, industry, research. W3C Member submission OWL 1.1: December 2006, following a vote at OWLED 2006. OWL Working Group: November 2007, following a vote at OWLED 2007. OWL 2 Recommendation: October 2009.
11. Back on topic… Language Design, Profiles, Exchange Syntaxes, Nifty Features (datatype coolness, properties & restrictions, syntactic sugar, punning, annotations), bonus material.
12. Language Design (1) OWL 1: Abstract Syntax, frame-based (DL: axioms, Full: rules… then why frames?). Hard to use for defining semantics, to parse, to extend. "An OWL ontology is an RDF graph." OWL 2: Structural Specification: axiom-centred UML/MOF data model; "an OWL 2 ontology is an instance O of the Ontology UML class"; "any OWL 2 ontology can also be viewed as an RDF graph." OWL 2: Functional Style Syntax: BNF grammar, http://www.w3.org/TR/owl2-syntax/
13. Language Design (2)
OWL 1: Species (Lite, DL, Full). Confusion between semantics and syntax. OWL Lite? Nah…
OWL 2: Semantics. Direct Semantics (DL): http://www.w3.org/TR/owl-direct-semantics ; RDF-Based Semantics (Full): http://www.w3.org/TR/owl-rdf-based-semantics
Most OWL 2 DL ontologies are OWL 1 Full
14. Profiles
OWLs living in the swamps of Amsterdam
OWL 2 EL: polynomial-time algorithms for standard reasoning tasks; large ontologies (TBox)
OWL 2 QL: conjunctive query answering in LogSpace using RDB technology; lightweight ontologies that organize many individuals; access the data directly via relational queries (e.g., SQL)
OWL 2 RL (a.k.a. RDFS 3.0?): polynomial-time algorithms using rule-extended DB technologies; lightweight ontologies that organize many individuals; operate directly on RDF triples; rule set provided by the specification
Semantics follows from syntactic restrictions
Extra “global restrictions” for OWL 2 DL, QL and EL
Extensible!
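The RL rule set mentioned above is given in the Profiles specification as if-then rules over triple patterns T(s, p, o); as one sketch, its transitivity rule (named prp-trp in the spec) reads roughly:

```
IF   T(?p, rdf:type, owl:TransitiveProperty)
AND  T(?x, ?p, ?y)
AND  T(?y, ?p, ?z)
THEN T(?x, ?p, ?z)
```

Because such rules operate directly on triples, they can be implemented on top of rule-extended database or triple-store technology.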
15. Exchange Syntaxes
OWL 1: RDF/XML (2004); W3C Note: OWL XML Syntax (2003)
OWL 2: RDF/XML (mandatory); Turtle; Functional Style Syntax; OWL XML (2009) (+ mandatory GRDDL transformation); Manchester Syntax
16. Hey, show me those nifty features already! Yeah yeah…
17. Datatypes (1)
Extended XML Schema compatibility
New datatypes not in XML Schema: owl:real, owl:rational
Datatype definitions: xsd:minInclusive, xsd:maxInclusive, xsd:minExclusive, xsd:maxExclusive, xsd:pattern (e.g. regular expressions), xsd:length
rdf:PlainLiteral (together with the RIF WG): all RDF plain literals; not to be used in syntaxes that already deal with RDF plain literals
DatatypeDefinition( a:SSN DatatypeRestriction( xsd:string xsd:pattern "[0-9]{3}-[0-9]{2}-[0-9]{4}" ) )
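Once defined, a datatype such as a:SSN above can be used wherever a data range is expected; a sketch with an invented data property a:hasSSN:

```
SubClassOf( a:Person
  DataAllValuesFrom( a:hasSSN a:SSN ) )
```

This restricts every a:hasSSN value of an a:Person to literals matching the SSN pattern.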
18. Datatypes (2)
Datatype Definitions; Data Range Combinations; Keys (only hold for named individuals)
DatatypeDefinition( :adultAge DatatypeRestriction( xsd:integer xsd:minInclusive 18 ) )
DataComplementOf( :adultAge )
DataUnionOf( :adultAge :studentAge )
…
HasKey( :Transplantation :donorId :recipientId :ofOrgan )
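The practical effect of a key is that named individuals agreeing on all key properties are inferred to be identical; a sketch with invented individuals (declarations elided):

```
ClassAssertion( :Transplantation :t1 )
ClassAssertion( :Transplantation :t2 )
DataPropertyAssertion( :donorId :t1 "D17" )
DataPropertyAssertion( :donorId :t2 "D17" )
DataPropertyAssertion( :recipientId :t1 "R42" )
DataPropertyAssertion( :recipientId :t2 "R42" )
ObjectPropertyAssertion( :ofOrgan :t1 :kidney )
ObjectPropertyAssertion( :ofOrgan :t2 :kidney )
```

Given the HasKey axiom above, a reasoner entails SameIndividual( :t1 :t2 ); note this holds only because :t1 and :t2 are named individuals, anonymous ones are exempt.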
19. Datatypes (3)
N-ary datatypes extension (Working Group Note): http://www.w3.org/TR/owl2-dr-linear/
Linear equations
DataAllValuesFrom( :meltingPoint :boilingPoint DataComparison( Arguments( x y ) leq( x y ) ) )
20. Properties (1)
Property Types: asymmetric properties; reflexive and irreflexive properties; top and bottom properties
Property chains:
SubObjectPropertyOf( ObjectPropertyChain( a:hasMother a:hasSister ) a:hasAunt )
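At the instance level, the chain axiom composes assertions; a sketch with invented individuals:

```
ObjectPropertyAssertion( a:hasMother a:stewie a:lois )
ObjectPropertyAssertion( a:hasSister a:lois a:carol )
```

From these two assertions and the chain axiom, a reasoner entails ObjectPropertyAssertion( a:hasAunt a:stewie a:carol ).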
22. Just an illustration (three, actually)
SubObjectPropertyOf( ObjectPropertyChain( a:isElephant owl:TopObjectProperty a:isMouse ) a:likes )
24. Punning (wordplay)
Any name can be used for any type of entity
Direct Semantics: interpreted as separate entities
RDF-Based Semantics: interpreted as the same entity
… but no punning between: datatype and class names; data-, object- and annotation property names (actually supported by most implementations)
Consequence: strongly typed syntax (FS, OWL/XML) … but not in RDF graphs
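A sketch of punning (names invented): the same IRI a:Eagle is used both as an individual and as a class:

```
ClassAssertion( a:Species a:Eagle )
ClassAssertion( a:Eagle a:Harry )
```

Under the Direct Semantics, the class a:Eagle and the individual a:Eagle are interpreted independently; under the RDF-Based Semantics they denote the same resource.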
26. Imports & Versioning
Import by location … but comes down to ‘just’ dereferencing
OntologyIRI and VersionIRI:
Ontologies should be accessible at the OntologyIRI if no VersionIRI is supplied, or if it is the latest version
Ontologies should be accessible at the VersionIRI if a VersionIRI is supplied
An import statement may point to either
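A sketch of an ontology header carrying both IRIs plus an import (all IRIs invented); the first IRI is the OntologyIRI, the second the VersionIRI:

```
Ontology( <http://example.org/onto>
          <http://example.org/onto/2.0>
  Import( <http://example.org/other> )
)
```

An importing ontology may then reference either <http://example.org/onto> (resolving to the latest version) or <http://example.org/onto/2.0> (pinning this particular version).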
27. Other things…
Internationalized Resource Identifiers
BNodes are existentials; anonymous individuals are BNodes; global restrictions for the Direct Semantics … no change in RDF
Declarations indicate what ontology defines an entity … but mostly just nice for parsers, no change in RDF
ObjectPropertyAssertion( <http://example.org/p> <http://example.org/a> _:http://example.org/#genid-x )
ClassAssertion( ObjectSomeValuesFrom( <http://example.org/p> owl:Thing ) <http://example.org/a> )
28. Bonus Material
Pretty decent outreach material:
Comprehensive OWL 2 Overview: http://www.w3.org/TR/owl-overview/
OWL 2 Quick Reference Card: http://www.w3.org/TR/owl-quick-reference/
OWL 2 Primer: http://www.w3.org/TR/owl-primer/
OWL 2 New Features and Rationale: http://www.w3.org/TR/owl-new-features/
OWL 2 Conformance: http://www.w3.org/TR/owl-conformance
29. What I like about OWL 2
Cleaner language design
Added expressiveness: properties, datatypes
Increased compatibility between Full and DL: punning, annotation properties
Profiles … most notably OWL 2 RL
… hooks for extensibility