  1. Semantic Evaluation At Large Scale (SEALS). Stuart N. Wrigley, Raúl García-Castro, Lyndon Nixon + the SEALS Consortium. WWW 2012, Lyon, France, 18th April 2012. Speaker: Raúl García-Castro.
  2. The SEALS Project (RI-238975). Project Coordinator: Asunción Gómez Pérez. EC contribution: 3,500,000 €. Duration: June 2009 to May 2012. Partners: Universidad Politécnica de Madrid, Spain (coordinator); University of Sheffield, UK; University of Mannheim, Germany; Forschungszentrum Informatik, Germany; University of Zurich, Switzerland; University of Innsbruck, Austria; STI International, Austria; Institut National de Recherche en Informatique et en Automatique, France; Open University, UK; Oxford University, UK.
  3. Semantic Technology Evaluation @ SEALS: the SEALS Platform, the SEALS Evaluation Services, the SEALS Evaluation Campaigns, and the SEALS Community.
  4. The SEALS entities: Tools, Evaluations, Test Data, and Results (raw results and interpretations), covering five technology areas: ontology engineering, storage and reasoning, ontology matching, semantic search, and semantic web services.
  5. Structure of the SEALS entities: each entity is packaged as a bundle (Java binaries, BPEL, shell scripts, ontologies) together with metadata and data described using the SEALS ontologies, enabling validation, discovery, and exploitation.
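The bundle-plus-metadata idea on slide 5 can be illustrated with a small sketch. The bundle layout, the `metadata.txt` descriptor, and the field names below are illustrative assumptions for this example, not the actual SEALS ontology vocabulary or packaging format:

```python
# Hypothetical sketch: validating an entity bundle before it is uploaded.
# Layout and metadata field names are assumptions, not the SEALS format.
import io
import zipfile

REQUIRED_METADATA = {"id", "version", "type"}  # assumed minimal descriptor fields

def validate_bundle(data: bytes) -> list[str]:
    """Return a list of validation problems (an empty list means OK)."""
    problems = []
    with zipfile.ZipFile(io.BytesIO(data)) as bundle:
        if "metadata.txt" not in bundle.namelist():
            return ["missing metadata descriptor"]
        fields = {}
        for line in bundle.read("metadata.txt").decode().splitlines():
            if "=" in line:
                key, _, value = line.partition("=")
                fields[key.strip()] = value.strip()
        for field in sorted(REQUIRED_METADATA - fields.keys()):
            problems.append(f"missing metadata field: {field}")
    return problems

# Build a tiny in-memory bundle and validate it
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as z:
    z.writestr("metadata.txt", "id=my-tool\nversion=1.0\ntype=tool\n")
print(validate_bundle(buf.getvalue()))  # -> []
```

Validating metadata up front is what makes the entities discoverable and exploitable later: a repository can reject or flag bundles whose descriptors are incomplete.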
  6. SEALS logical architecture: Evaluation Organisers, Technology Providers, and Technology Adopters (including software agents) access the SEALS Portal and the Runtime Evaluation Service through the SEALS Service Manager, which connects to the SEALS Repositories: the Test Data, Tools, Results, and Evaluation Descriptions repository services.
  7. Challenges: tool heterogeneity (diverse hardware and software requirements) and reproducibility (the execution environment must offer the same initial status for every run). Virtualization is used as the technology enabler: the Runtime Evaluation Service (RES) Core dispatches work to RES Workers, each of which runs a tool inside a guest virtual machine on a physical computing node. Virtualization solutions: VMWare Server 2.0.2, VMWare vSphere 4, and Amazon EC2 (in progress).
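The reproducibility argument on slide 7 can be sketched in a few lines: if every evaluation run is provisioned from the same frozen base image, one tool's side effects cannot leak into the next run. The classes and names below are hypothetical stand-ins for the virtualization machinery, not the actual SEALS or VMWare API:

```python
# Illustrative sketch of the reproducibility idea behind the Runtime
# Evaluation Service: every run starts from the same pristine image.
# All names here are hypothetical, not the SEALS/VMWare API.
import copy

class BaseImage:
    """A frozen snapshot of the initial execution environment."""
    def __init__(self, packages, env):
        self.state = {"packages": packages, "env": env}

class GuestVM:
    """A disposable clone of the base image for one evaluation run."""
    def __init__(self, image: BaseImage):
        self.state = copy.deepcopy(image.state)  # fresh copy per run

def run_evaluation(image, tool):
    vm = GuestVM(image)       # provision a clean guest
    result = tool(vm.state)   # the tool may freely mutate its own VM
    return result             # the guest is discarded afterwards

base = BaseImage(packages=["jdk-6"], env={"LANG": "C"})
# A badly behaved tool that mutates its environment as a side effect
dirty_tool = lambda s: s["packages"].append("leftover") or len(s["packages"])
run_evaluation(base, dirty_tool)
print(base.state["packages"])  # -> ['jdk-6']: the base image is untouched
```

The deep copy plays the role of cloning a virtual machine: each run sees exactly the same initial status regardless of what earlier runs did.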
  8. SEALS Evaluation Services, by technology area:
     - Ontology engineering. Evaluations: conformance, interoperability, scalability. Test data: for conformance and interoperability, RDF(S), OWL Lite/DL/Full, and OWL 2; for scalability, real-world ontologies, LUBM, and real-world + LUBM.
     - Ontology reasoning. Evaluations: DL reasoning (classification, class satisfiability, ontology satisfiability, entailment, non-entailment, instance retrieval) and RDF reasoning (conformance). Test data: for DL reasoning, the Gardiner test suite, the Wang et al. repository, versions of GALEN, ontologies from EU projects, and instance retrieval test data; for RDF reasoning, OWL 2 Full.
     - Ontology matching. Evaluations: matching accuracy and efficiency, multilingual matching accuracy, and scalability (ontology size, number of CPUs). Test data: Benchmark, Anatomy, Conference, MultiFarm, Large Biomed.
     - Semantic search. Evaluations: search accuracy and efficiency (automated); usability and satisfaction (user-in-the-loop). Test data: automated, EvoOnt and MusicBrainz (from QALD-1); user-in-the-loop, Mooney.
     - Semantic web services. Evaluations: SWS discovery. Test data: OWLS-TC 4.0, SAWSDL-TC 3.0, and WSMO-LITE-TC (supported by SEALS).
  9. Evaluation campaign methodology: a SEALS-independent methodology for evaluation campaigns (Raúl García-Castro and Stuart N. Wrigley, September 2011). It covers the actors, the process, recommendations, alternatives, terms of participation, and use rights, organised in five phases: initiation, involvement, preparation & execution, dissemination, and finalization.
  10. 1st Evaluation Campaign: 29 tools from 8 countries.
     - Ontology engineering: Jena (HP Labs, UK); Sesame (Aduna, Netherlands); Protégé 4 and Protégé OWL (Stanford University, USA); NeOn Toolkit (NeOn Foundation, Europe); OWL API (University of Manchester, UK).
     - Reasoning: HermiT (University of Oxford, UK); jcel (Technische Universität Dresden, Germany); FaCT++ (University of Manchester, UK).
     - Matching: AROMA (INRIA and Nantes University, France); ASMOV (INFOTECH Soft, USA); Falcon-AO and Lily (Southeast University, China); RiMOM (Tsinghua University, China); MapPSO (FZI, Germany); CODI (University of Mannheim, Germany); AgreementMaker (Advances in Computing Lab, USA); GeRoMe (RWTH Aachen, Germany); Ef2Match (Nanyang Technological University, China).
     - Semantic search: K-Search (K-Now Ltd, UK); Ginseng and NLP-Reduce (University of Zurich, Switzerland); PowerAqua (KMi, Open University, UK); Jena ARQ (HP Labs/Talis, UK).
     - Semantic web services: 4 OWLS-MX variants (DFKI, Germany).
     Results are reported in "The state of semantic technology today: overview of the first SEALS evaluation campaigns" by Raúl García-Castro, Mikalai Yatskevich, Cássia Trojahn dos Santos, Stuart N. Wrigley, Liliana Cabral, Lyndon Nixon and Ondřej Šváb-Zamazal, April 2011.
  11. 2nd Evaluation Campaign timeline: announcement at ESWC 2011 (June 2011); definition of evaluations and test data (from September 2011), when the community could provide requirements and comment on the evaluations and test data; launch of the 2nd campaign (January 2012), when participants could join the campaign and run their own evaluations; results of the 2nd campaign (May 2012); results presented at ESWC 2012 (June 2012).
  12. Community services. Dissemination services: portal, public wiki, blog, Twitter, mailing lists. Collaboration services: forums, private wiki, private mailing lists. Community materials: whitepaper, deliverables, papers, tutorial.
  13. Evaluation services: execute existing evaluations over existing test data with your own tool to obtain your results; exploit those results; update the evaluations, tools, or test data (for example, supplying your own test data); or define your own evaluation with your own tool and test data.
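The reuse workflow on slide 13 (existing evaluation + existing test data + my tool = my results) can be sketched as follows. The evaluation, the test data, and the tool interface are all illustrative assumptions, not the SEALS Platform API:

```python
# Hypothetical sketch of the evaluation-service workflow: reuse an
# existing evaluation and test data set, plug in your own tool, and
# collect your own results. All names are illustrative.
def conformance_evaluation(tool, test_cases):
    """A reusable evaluation: run the tool on each case, score its output."""
    return {case: tool(case) == expected
            for case, expected in test_cases.items()}

# Existing test data from a repository (stand-in values)
test_data = {"case-1": "yes", "case-2": "no"}

# "My tool": any callable with the expected interface
my_tool = lambda case: "yes"

my_results = conformance_evaluation(my_tool, test_data)
print(my_results)  # -> {'case-1': True, 'case-2': False}
```

Because the evaluation and test data are defined independently of any one tool, swapping in a different tool (or different test data) rerpresents exactly the "update them or define your own" paths on the slide: only one ingredient changes while the rest is reused.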
  14. Conclusions: the SEALS Platform will be maintained for at least 3 years after the end of the project (until June 2015). It facilitates comparing tools under common settings, reproducibility of evaluations, reusing evaluation resources (completely or partially) or defining new ones, managing evaluation resources using platform services, and access to computational resources for demanding evaluations. Don't start your evaluation from scratch!
  15. Contribute to the SEALS Community! An open-source development community. WWW 2012, Lyon, France, 18th April 2012. Speaker: Raúl García-Castro.