
EKAW - Publishing with Triple Pattern Fragments


Slides for the presentation on Publishing with Triple Pattern Fragments, part of the "Modeling, Generating and Publishing Knowledge as Linked Data" tutorial at EKAW 2016.


  1. 1. Publishing with Triple Pattern Fragments Ruben Taelman - @rubensworks imec - Ghent University 1
  2. 2. Publishing with Triple Pattern Fragments TPF server software TPF client-side querying Quad Pattern Fragments Demo 2
  3. 3. Publishing with Triple Pattern Fragments TPF server software TPF client-side querying Quad Pattern Fragments Demo 3
  4. 4. http://linkeddatafragments.org/software/ JavaScript LDF server Python Perl Ruby PHP Java ... TPF server implementations in different languages 4
  5. 5. Requires Node.js ≥ 4.0 Installing and running the LDF server with Node 5 [sudo] npm install -g ldf-server ldf-server config.json <port> <workers> Documentation: https://github.com/LinkedDataFragments/Server.js
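  As a concrete example (the port number and worker count below are illustrative values, and config.json is assumed to be the configuration file shown on slides 8–11):

      npm install -g ldf-server           # prefix with sudo if needed
      ldf-server config.json 5000 4       # serve on port 5000 with 4 worker processes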
  6. 6. Requires Docker Installing and running the LDF server with Docker 6 docker pull linkeddatafragments/server.js docker run -p <port>:3000 -it --rm -v $(pwd)/config.json:/tmp/config.json ldf-server /tmp/config.json
  7. 7. Preconfigured LDF server with NGINX cache and web-client Setting up a full stack with Docker (Compose) 7 docker-compose up https://github.com/LinkedDataFragments/FullStackServer
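  A minimal sketch of what such a docker-compose.yml could look like. The service names, the NGINX configuration file and the exact wiring are assumptions; the FullStackServer repository linked above contains the actual preconfigured setup.

      version: "2"
      services:
        server:                                    # TPF server (image from the previous slide)
          image: linkeddatafragments/server.js
          volumes:
            - ./config.json:/tmp/config.json
          command: /tmp/config.json
        cache:                                      # NGINX reverse-proxy cache in front of the server
          image: nginx
          ports:
            - "80:80"
          volumes:
            - ./nginx.conf:/etc/nginx/nginx.conf    # hypothetical cache/proxy configuration
          depends_on:
            - server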
  8. 8. { "title": "My Linked Data Fragments server", "datasources": { "dbpedia": { "title": "DBpedia 2014", "type": "HdtDatasource", "description": "DBpedia 2014 with an HDT back-end", "settings": { "file": "data/dbpedia2014.hdt" } }, "dbpedia-sparql": { "title": "DBpedia 3.9 (Virtuoso)", "type": "SparqlDatasource", "description": "DBpedia 3.9 with a Virtuoso back-end", "settings": { "endpoint": "http://dbpedia.restdesc.org/", "defaultGraph": "http://dbpedia.org" } } } } LDF server is configured with config.json 8
  9. 9. { "title": "My Linked Data Fragments server", "datasources": { "dbpedia": { "title": "DBpedia 2014", "type": "HdtDatasource", "description": "DBpedia 2014 with an HDT back-end", "settings": { "file": "data/dbpedia2014.hdt" } }, "dbpedia-sparql": { "title": "DBpedia 3.9 (Virtuoso)", "type": "SparqlDatasource", "description": "DBpedia 3.9 with a Virtuoso back-end", "settings": { "endpoint": "http://dbpedia.restdesc.org/", "defaultGraph": "http://dbpedia.org" } } } } Configure list of datasources 9
  10. 10. { "title": "My Linked Data Fragments server", "datasources": { "dbpedia": { "title": "DBpedia 2014", "type": "HdtDatasource", "description": "DBpedia 2014 with an HDT back-end", "settings": { "file": "data/dbpedia2014.hdt" } }, "dbpedia-sparql": { "title": "DBpedia 3.9 (Virtuoso)", "type": "SparqlDatasource", "description": "DBpedia 3.9 with a Virtuoso back-end", "settings": { "endpoint": "http://dbpedia.restdesc.org/", "defaultGraph": "http://dbpedia.org" } } } } Load an HDT file 10
  11. 11. { "title": "My Linked Data Fragments server", "datasources": { "dbpedia": { "title": "DBpedia 2014", "type": "HdtDatasource", "description": "DBpedia 2014 with an HDT back-end", "settings": { "file": "data/dbpedia2014.hdt" } }, "dbpedia-sparql": { "title": "DBpedia 3.9 (Virtuoso)", "type": "SparqlDatasource", "description": "DBpedia 3.9 with a Virtuoso back-end", "settings": { "endpoint": "http://dbpedia.restdesc.org/", "defaultGraph": "http://dbpedia.org" } } } } Act as a proxy to a SPARQL endpoint 11
  12. 12. HDT N-Triples Turtle JSON-LD SPARQL-endpoint Different data sources are possible 12
  13. 13. by extending lib/datasources/Datasource.js Return triple stream given triple pattern, offset and limit ...or write your own datasource implementation 13
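  A rough JavaScript sketch of such a custom datasource. The hook name _executeQuery, the destination API and the require path are assumptions for illustration only; lib/datasources/Datasource.js in the Server.js repository defines the real interface.

      // Illustrative sketch: method names and the destination API are assumptions.
      var Datasource = require('ldf-server/lib/datasources/Datasource');

      // Tiny in-memory dataset, just for illustration.
      var TRIPLES = [
        { subject: 'http://example.org/alice',
          predicate: 'http://xmlns.com/foaf/0.1/knows',
          object: 'http://example.org/bob' },
      ];

      function MyDatasource(options) {
        Datasource.call(this, options);
      }
      MyDatasource.prototype = Object.create(Datasource.prototype);

      // Given a triple pattern with offset and limit, emit the matching triples
      // and an (estimated) total count on the destination stream.
      MyDatasource.prototype._executeQuery = function (query, destination) {
        var matches = TRIPLES.filter(function (t) {
          return (!query.subject   || t.subject   === query.subject) &&
                 (!query.predicate || t.predicate === query.predicate) &&
                 (!query.object    || t.object    === query.object);
        });
        destination.setProperty('metadata', { totalCount: matches.length });
        matches.slice(query.offset, query.offset + query.limit)
               .forEach(function (triple) { destination._push(triple); });
        destination.close();
      };

      module.exports = MyDatasource;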
  14. 14. Exposing multiple datasources as if they were one Useful for fragmented datasets that can’t be merged Compose multiple datasources 14 https://github.com/LinkedDataFragments/Server.js/blob/master/config/config-composite.json
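  A hedged sketch of what a composite datasource entry in config.json could look like; the CompositeDatasource type name and the references key are assumptions here, and the config-composite.json linked above is the authoritative example.

      "composite": {
        "title": "DBpedia (combined)",
        "type": "CompositeDatasource",
        "description": "The two DBpedia datasources exposed as one",
        "settings": { "references": ["dbpedia", "dbpedia-sparql"] }
      }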
  15. 15. Publishing with Triple Pattern Fragments TPF server software TPF client-side querying Quad Pattern Fragments Demo 15
  16. 16. http://linkeddatafragments.org/software/ JavaScript LDF client Python Perl Java ... Different client implementations exist as well 16
  17. 17. No installation required at all! http://client.linkeddatafragments.org/ Use the web client 17
  18. 18. Requires Node.js ≥ 4.0 Run from command line or include in your source code Installing and running the LDF client with Node 18 [sudo] npm install -g ldf-client ldf-client <tpf-endpoint-url> <sparql-query-path> Documentation: https://github.com/LinkedDataFragments/Client.js
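  The client can also be used as a library instead of from the command line. A small sketch along the lines of the Client.js documentation (the fragments endpoint URL and the query are illustrative, and the API may differ between versions):

      var ldf = require('ldf-client');

      // Point the client at a TPF endpoint (illustrative URL).
      var fragmentsClient = new ldf.FragmentsClient('http://fragments.dbpedia.org/2014/en');

      // Evaluate a SPARQL query over the fragments and stream the results.
      var query = 'SELECT ?city WHERE { ?city a <http://dbpedia.org/ontology/City> } LIMIT 10';
      var results = new ldf.SparqlIterator(query, { fragmentsClient: fragmentsClient });
      results.on('data', function (result) { console.log(result); });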
  19. 19. Requires Docker Installing and running the LDF client with Docker 19 docker pull linkeddatafragments/client.js docker run -it --rm -v <sparql-query-path>:/tmp/query.sparql linkeddatafragments/client.js <tpf-endpoint-url> /tmp/query.sparql
  20. 20. Publishing with Triple Pattern Fragments TPF server software TPF client-side querying Quad Pattern Fragments 20
  21. 21. Quads are triples extended with a fourth element, the graph <s> <p> <o> <g>. Triples sometimes need some context 21
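  For example, in N-Quads syntax (the URIs are made up), the fourth element names the graph the triple belongs to:

      <http://example.org/alice> <http://xmlns.com/foaf/0.1/knows> <http://example.org/bob> <http://example.org/graph1> .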
  22. 22. Quad Pattern Fragments (QPF) Adding the fourth element to the interface 22
  23. 23. SELECT * FROM <http://example.org/graph0> FROM NAMED <http://example.org/graph1> WHERE { GRAPH ?g { ?s ?p ?o. } } QPF Clients are able to use quad features 23
  24. 24. QPF is backwards-compatible with TPF 24 Any combination of TPF and QPF clients and servers keeps working; the graph element is simply ignored where only TPF is supported
  25. 25. Quad Pattern Fragments (WIP) https://github.com/LinkedDataFragments/Server.js/tree/feature-qpf-latest https://github.com/LinkedDataFragments/Client.js/tree/feature-qpf-latest 25
  26. 26. Publishing with Triple Pattern Fragments TPF server software TPF client-side querying Quad Pattern Fragments Demo 26
  27. 27. Let’s try overloading the DBpedia TPF endpoint! 27 http://fragments.dbpedia.org/ Uptime of 99.9967%
  28. 28. 28
  29. 29. 29
  30. 30. 30 Some query types are very slow ~1 hour!
  31. 31. Server running TPF server software with any dataset 31 TPF server software RDF dataset HDT dataset SPARQL endpoint ...
